Answering the top 10 questions about supercloud

As we exited the isolation economy last year, we introduced supercloud as a term to describe something new that was happening in the world of cloud computing.

In this Breaking Analysis, we address the 10 most frequently asked questions we get on supercloud:


1. In an industry full of hype and buzzwords, why does anyone need a new term?

2. Aren’t hyperscalers building out superclouds? We’ll try to answer why the term supercloud connotes something different from a hyperscale cloud.

3. We’ll talk about the problems superclouds solve.

4. We’ll further define the critical aspects of a supercloud architecture.

5. We often get asked: Isn’t this just multicloud? Well, we don’t think so and we’ll explain why.

6. In an earlier episode we introduced the notion of superPaaS – well, isn't a plain vanilla PaaS already a superPaaS? Again – we don't think so and we'll explain why.

7. Who will actually build (and who are the players currently building) superclouds?

8. What workloads and services will run on superclouds?

9. What are some examples of supercloud?

10. Finally, we’ll answer what you can expect next on supercloud from SiliconANGLE and theCUBE.

Why do we need another buzzword?

Late last year, ahead of Amazon Web Services Inc.’s re:Invent conference, we were inspired by a post from Jerry Chen called Castles in the Cloud. In that blog he introduced the idea that there were submarkets emerging in cloud that presented opportunities for investors and entrepreneurs, that the big cloud vendors weren’t going to suck all the value out of the industry. And so we introduced this notion of supercloud to describe what we saw as a value layer emerging above the hyperscalers’ “capex gift.”

It turns out that we weren’t the only ones using the term, as both Cornell and MIT have used the phrase in somewhat similar but different contexts.

The point is something new was happening in the AWS and other ecosystems. It was more than infrastructure as a service and platform as a service and wasn’t just software as a service running in the cloud.

It was a new architecture that integrates infrastructure, unique platform attributes and software to solve new problems that the cloud vendors in our view weren’t addressing by themselves. It seemed to us that the ecosystem was pursuing opportunities across clouds that went beyond conventional implementations of multi-cloud.

In addition, we felt this trend pointed to structural change going on at the industry level that supercloud metaphorically was highlighting.

So that’s the background on why we felt a new catchphrase was warranted. Love it or hate it… it’s memorable.

Industry structures have always mattered in tech

To that last point about structural industry transformation: Andy Rappaport is sometimes credited with identifying the shift from the vertically integrated mainframe era to the horizontally fragmented personal computer- and microprocessor-based era in his Harvard Business Review article from 1991.

In fact, it was actually David Moschella, an International Data Corp. senior vice president at the time, who introduced the concept in 1987, a full four years before Rappaport's article was published. Moschella, along with IDC's head of research Will Zachmann, saw clearly that Intel Corp., Microsoft Corp., Seagate Technology and others would supplant the system vendors' dominance.

In fact, Zachmann accurately predicted in the late 1980s the demise of IBM, well ahead of its epic downfall when the company lost approximately 75% of its value. At an IDC Briefing Session (now called Directions), Moschella put forth a graphic that looked similar to the first two concepts on the chart below.

We don’t have to review the shift from IBM as the epicenter of the industry to Wintel – that’s well-understood.

What isn’t as widely discussed is a structural concept Moschella put out in 2018 in his book “Seeing Digital,” which introduced the idea of the Matrix shown on the righthand side of this chart. Moschella posited that a new digital platform of services was emerging built on top of the internet, hyperscale clouds and other intelligent technologies that would define the next era of computing.

He used the term matrix because the conceptual depiction included horizontal technology rows, like the cloud… but for the first time included connected industry columns. Moschella pointed out that historically, industry verticals had a closed value chain or stack of research and development, production, distribution, etc., and that expertise in that specific vertical was critical to success. But now, because of digital and data, for the first time, companies were able to jump industries and compete using data. Amazon in content, payments and groceries… Apple in payments and content… and so forth. Data was now the unifying enabler and this marked a changing structure of the technology landscape.

Listen to David Moschella explain the Matrix and its implications on a new generation of leadership in tech.

So the term supercloud is meant to imply more than running in hyperscale clouds. Rather, it’s a new type of digital platform comprising a combination of multiple technologies – enabled by cloud scale – with new industry participants from financial services, healthcare, manufacturing, energy, media and virtually all industries. Think of it as kind of an extension of “every company is a software company.”

Basically, thanks to the cloud, every company in every industry now has the opportunity to build their own supercloud. We’ll come back to that.

Aren’t hyperscale clouds superclouds?

Let’s address what’s different about superclouds relative to hyperscale clouds.

This one’s pretty straightforward and obvious. Hyperscale clouds are walled gardens where they want your data in their cloud and they want to keep you there. Sure, every cloud player realizes that not all data will go to their cloud, so they’re meeting customers where their data lives with initiatives such Amazon Outposts and Azure Arc and Google Anthos. But at the end of the day, the more homogeneous they can make their environments, the better control, security, costs and performance they can deliver. The more complex the environment, the more difficult to deliver on their promises and the less margin left for them to capture.

Will the hyperscalers get more serious about cross-cloud services? Maybe, but they have plenty of work to do within their own clouds. And today at least they appear to be providing the tools that will enable others to build superclouds on top of their platforms. That said, we never say never when it comes to companies such as AWS. And for sure we see AWS delivering more integrated digital services such as Amazon Connect to solve problems in a specific domain, call centers in this case.

What problems do superclouds solve?

We’ve all seen the stats from IDC or Gartner or whomever that customers on average use more than one cloud. And we know these clouds operate in disconnected silos for the most part. That’s a problem because each cloud requires different skills. The development environment is different, as is the operating environment, with different APIs and primitives and management tools that are optimized for each respective hyperscale cloud. Their functions and value props don’t extend to their competitors’ clouds. Why would they?

As a result, there’s friction when moving between different clouds. It’s hard to share data, move work, secure and govern data, and enforce organizational policies and edicts across clouds.

Supercloud is an architecture designed to create a single environment that enables management of workloads and data across clouds in an effort to take out complexity, accelerate application development, streamline operations and share data safely irrespective of location.

Pretty straightforward, but nontrivial, which is why we often ask chief executives whether stock buybacks and dividends will yield as much return as building out superclouds that solve really specific problems and create differentiable value for their firms.

What are the critical attributes of a supercloud?

Let’s dig in a bit more to the architectural aspects of supercloud. In other words… what are the salient attributes that define supercloud?

First, a supercloud runs a set of specific services, designed to solve a unique problem. Superclouds offer seamless, consumption-based services across multiple distributed clouds.

Supercloud leverages the underlying cloud-native tooling of a hyperscale cloud but it’s optimized for a specific objective that aligns with the problem it’s solving. For example, it may be optimized for cost or low latency or sharing data or governance or security or higher performance networking. But the point is, the collection of services delivered is focused on unique value that isn’t being delivered by the hyperscalers across clouds.

A supercloud abstracts the underlying, siloed primitives of the native PaaS layer from the hyperscale cloud and, using its own platform-as-a-service tooling, creates a common experience across clouds for developers and users. In other words, the superPaaS ensures that the developer and user experience is identical, irrespective of which cloud or location is running the workload.

And it does so in an efficient manner, meaning it has the metadata knowledge and management that can optimize for latency, bandwidth, recovery, data sovereignty or whatever unique value the supercloud is delivering for the specific use cases in the domain.
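
To make those two ideas – abstracted primitives and metadata-driven placement – concrete, here is a minimal sketch of what a superPaaS-style storage interface could look like. Every name in it is hypothetical and invented for illustration; real platforms of this kind are, of course, far richer.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """A single storage primitive a superPaaS might expose to developers,
    hiding each hyperscaler's native API behind one interface."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class AwsStoreAdapter(ObjectStore):
    """Stand-in for an adapter over AWS-native primitives (e.g. S3)."""
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

class AzureStoreAdapter(AwsStoreAdapter):
    """Stand-in for an adapter over Azure Blob Storage."""

def place_workload(latency_ms):
    # The metadata layer picks a cloud based on whatever the supercloud
    # optimizes for -- latency here, but it could be cost or sovereignty.
    return min(latency_ms, key=latency_ms.get)

stores = {"aws": AwsStoreAdapter(), "azure": AzureStoreAdapter()}
target = place_workload({"aws": 12.5, "azure": 9.8})
stores[target].put("orders/2022-07.parquet", b"...")
```

The point of the pattern is the last line: developer code writes against one interface, and placement across clouds becomes a runtime decision the platform makes on the developer's behalf.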

A supercloud comprises a superPaaS capability that allows ecosystem partners to add incremental value on top of the supercloud platform to fill gaps, accelerate features and innovate. A superPaaS can use open tooling but applies those development tools to create a unique and specific experience supporting the design objectives of the supercloud.

Supercloud services can be infrastructure-related, application services, data services, security services, user services and so on, designed and packaged to bring unique value to customers – again, value that the hyperscalers are not delivering across clouds or on-premises.

Finally, these attributes are highly automated where possible. Superclouds take a page from hyperscalers in terms of minimizing human intervention wherever possible, applying automation to the specific problem they’re solving.

Isn’t supercloud just another term for multicloud?

What we’d say to that is: Perhaps, but not really. Call it multicloud 2.0 if you want to invoke a commonly used format. But as Dell’s Chuck Whitten proclaimed, multicloud by design is different than multicloud by default.

What he means is that, to date, multicloud has largely been a symptom of multivendor… or of M&A. And when you look at most so-called multicloud implementations, you see things like an on-prem stack wrapped in a container and hosted on a specific cloud.

Or increasingly a technology vendor has done the work of building a cloud-native version of its stack and running it on a specific cloud… but historically it has been a unique experience within each cloud with no connection between the cloud silos. And certainly not a common developer experience with metadata management across clouds.

Supercloud sets out to build incremental value across clouds and above hyperscale capex that goes beyond cloud compatibility within each cloud. So if you want to call it multicloud 2.0, that’s fine.

We choose to call it supercloud.

Isn’t plain old PaaS already supercloud?

Well, we’d say no. That supercloud and its corresponding superPaaS layer gives the freedom to store, process, manage, secure and connect islands of data across a continuum with a common developer experience across clouds.

Importantly, the sets of services are designed to support the supercloud’s objectives – e.g., data sharing or data protection or storage and retrieval or cost optimization or ultra-low latency, etc. In other words, the services offered are specific to that supercloud and will vary by each offering. OpenShift, for example, can be used to construct a superPaaS but in and of itself isn’t a superPaaS. It’s generic.

The point is that a supercloud and its inherent superPaaS will be optimized to solve specific problems such as low latency for distributed databases or fast backup and recovery and ransomware protection — highly specific use cases that the supercloud is designed to solve for.

SaaS as well is a subset of supercloud. Most SaaS platforms either run in their own cloud or have bits and pieces running in public clouds (e.g. analytics). But the cross-cloud services are few and far between or often nonexistent. We believe SaaS vendors must evolve and adopt supercloud to offer distributed solutions across cloud platforms and stretching out to the near and far edge.

Who is building superclouds?

Another question we often get is: Who has a supercloud and who is building a supercloud? Who are the contenders?

Well, most companies that consider themselves cloud players will, we believe, be building superclouds. Above is a common Enterprise Technology Research graphic we like to show, with Net Score or spending momentum on the Y axis and Overlap or pervasiveness in the ETR surveys on the X axis. This is from the April survey of well over 1,000 chief information officers and information technology buyers. And we've randomly chosen a number of players we think are in the supercloud mix, and we've included the hyperscalers because they are the enablers.

We’ve added some of those nontraditional industry players we see building superclouds such as Capital One, Goldman Sachs and Walmart, in deference to Moschella’s observation about verticals. This goes back to every company being a software company. And rather than pattern-matching an outdated SaaS model we see a new industry structure emerging where software and data and tools specific to an industry will lead the next wave of innovation via the buildout of intelligent digital platforms.

We’ve talked a lot about Snowflake Inc.’s Data Cloud as an example of supercloud, as well as the momentum of Databricks Inc. (not shown above). VMware Inc. is clearly going after cross-cloud services. Basically every large company we see is either pursuing supercloud initiatives or thinking about it. Dell Technologies Inc., for example, showed Project Alpine at Dell Technologies World – that’s a supercloud in development. Snowflake introducing a new app dev capability based on its SuperPaaS (our term, of course, it doesn’t use the phrase), MongoDB Inc., Couchbase Inc., Nutanix Inc., Veeam Software, CrowdStrike Holdings Inc., Okta Inc. and Zscaler Inc. Even the likes of Cisco Systems Inc. and Hewlett Packard Enterprise Co., in our view, will be building superclouds.

Although ironically, as an aside, Fidelma Russo, HPE’s chief technology officer, said on theCUBE she wasn’t a fan of cloaking mechanisms. But when we spoke to HPE’s head of storage services, Omer Asad, we felt his team is clearly headed in a direction that we would consider supercloud. It could be semantics or it could be that parts of HPE are in a better position to execute on supercloud. Storage is an obvious starting point. The same can be said of Dell.

Listen to Fidelma Russo explain her aversion to building a manager of managers.

And we’re seeing emerging companies like Aviatrix Systems Inc. (network performance), Starburst Data Inc. (self-service analytics for distributed data), Clumio Inc. (data protection – not supercloud today but working on it) and others building versions of superclouds that solve a specific problem for their customers. And we’ve spoken to independent software vendors such as Adobe Systems Inc., Automatic Data Processing LLC and UiPath Inc., which are all looking at new ways to go beyond the SaaS model and add value within cloud ecosystems, in particular building data services that are unique to their value proposition and will run across clouds.

So yeah – pretty much every tech vendor with any size or momentum, along with new industry players coming out of hiding, is competing to build superclouds. Many will look a lot like Moschella's Matrix, with machine intelligence, artificial intelligence, blockchains, virtual reality and gaming, all enabled by the internet and hyperscale clouds.

It’s moving fast and it’s the future, in our opinion, so don’t get too caught up in the past or you’ll be left behind.

What are some examples of superclouds?

We’ve given many in the past, but let’s try to be a bit more specific. Below we cite a few and we’ll answer two questions in one section here: What workloads and services will run in superclouds and what are some examples?

Analytics. Snowflake is the furthest along with its data cloud in our view. It’s a supercloud optimized for data sharing, governance, query performance, security, ecosystem enablement and ultimately monetization. Snowflake is now bringing in new data types and open-source tooling and it ticks the attribute boxes on supercloud we laid out earlier.

Converged databases. Running transaction and analytics workloads. Take a look at what Couchbase is doing with Capella and how it's stretching the cloud to the edge with Arm-based platforms, optimizing for low latency across clouds and out to the edge.

Document database workloads. Look at MongoDB – a developer-friendly platform that, with Atlas, is moving to a supercloud model, running document databases very efficiently, accommodating analytic workloads and creating a common developer experience across clouds.

Data science workloads. For example, Databricks is bringing a common experience for data scientists and data engineers driving machine intelligence into applications and fixing the broken data lake with the emergence of the lakehouse.

General-purpose workloads. For example, VMware’s domain. Very clearly there’s a need to create a common operating environment across clouds and on-prem and out to the edge and VMware is hard at work on that — managing and moving workloads, balancing workloads and being able to recover very quickly across clouds.

Network routing. This is the primary focus of Aviatrix, building what we consider a supercloud and optimizing network performance and automating security across clouds.

Industry-specific workloads. For example, Capital One announcing its cost optimization platform for Snowflake – piggybacking on Snowflake’s supercloud. We believe it’s going to test that concept outside its own organization and expand across other clouds as Snowflake grows its business beyond AWS. Walmart Inc. is working with Microsoft to create an on-prem to Azure experience – yes, that counts. We’ve written about what Goldman is doing and you can bet dollars to donuts that Oracle Corp. will be building a supercloud in healthcare with its Cerner acquisition.

Supercloud is everywhere you look. Sorry, naysayers. It’s happening.

What’s next from theCUBE?

With all the industry buzz and debate about the future, John Furrier and the team at SiliconANGLE have decided to host an event on supercloud. We’re motivated and inspired to further the conversation. TheCUBE on Supercloud is coming.

On Aug. 9 out of our Palo Alto studios we’ll be running a live program on the topic. We’ve reached out to a number of industry participants — VMware, Snowflake, Confluent, Sky High Security, Hashicorp, Cloudflare and Red Hat — to get the perspective of technologists building superclouds.

And we’ve invited a number of vertical industry participants in financial services, healthcare and retail that we’re excited to have on along with analysts, thought leaders and investors.

We’ll have more details in the coming weeks, but for now if you’re interested please reach out to us with how you think you can advance the discussion and we’ll see if we can fit you in.

So mark your calendars and stay tuned for more information.

Keep in touch

Thanks to Alex Myerson, who does the production, podcasts and media workflows for Breaking Analysis. Special thanks to Kristen Martin and Cheryl Knight, who help us keep our community informed and get the word out, and to Rob Hof, our editor in chief at SiliconANGLE.

Remember we publish each week on Wikibon and SiliconANGLE. These episodes are all available as podcasts wherever you listen.

Email david.vellante@siliconangle.com, DM @dvellante on Twitter and comment on our LinkedIn posts.

Also, check out this ETR Tutorial we created, which explains the spending methodology in more detail. Note: ETR is a separate company from Wikibon and SiliconANGLE. If you would like to cite or republish any of the company’s data, or inquire about its services, please contact ETR at legal@etr.ai.

Here’s the full video analysis:

All statements made regarding companies or securities are strictly beliefs, points of view and opinions held by SiliconANGLE media, Enterprise Technology Research, other guests on theCUBE and guest writers. Such statements are not recommendations by these individuals to buy, sell or hold any security. The content presented does not constitute investment advice and should not be used as the basis for any investment decision. You and only you are responsible for your investment decisions.

Disclosure: Many of the companies cited in Breaking Analysis are sponsors of theCUBE and/or clients of Wikibon. None of these firms or other companies have any editorial control over or advanced viewing of what’s published in Breaking Analysis.

Image: Rawpixel.com/Adobe Stock

Show your support for our mission by joining our Cube Club and Cube Event Community of experts. Join the community that includes Amazon Web Services and Amazon.com CEO Andy Jassy, Dell Technologies founder and CEO Michael Dell, Intel CEO Pat Gelsinger and many more luminaries and experts.

4 Reasons To Buy IBM
[Image: Cebit Technology Fair (Sean Gallup/Getty Images News)]

Timing the Market subscribers had early access to this report.

International Business Machines (NYSE:IBM) reports its Q2 earnings soon.

[Chart: IBM earnings date (NASDAQ)]

Several years back, I looked into the quarterly patterns in IBM and found that Q2 is pretty much random. Now, with six more years of data – and homing in specifically on holding over Q2 earnings – I've rerun the test and found different results. Put simply, from a statistical and seasonal perspective, buying IBM before its Q2 earnings report pays off:

ibm seasonal

Damon Verial

Even though we are holding over earnings, the max drawdown is 17%, which is only one-third as large as the buy-and-hold max drawdown, implying that holding over Q2 earnings - volatile as it may be - is actually relatively safe. The Sharpe ratio of this strategy concurs: At 42% higher than the Sharpe for buy-and-hold, simply holding IBM over Q2 earnings is a superior strategy from a risk/reward standpoint. Most interesting, however, is that Q2 earnings movements account for 65% of the upward trajectory of the stock.

It is natural to ask why Q2 would be special here. Based on EPS patterns, IBM is a rather cyclical stock. FQ2 is when IBM returns to upward momentum in EPS, after a reliably seasonal drop in FQ1 EPS:

[Chart: IBM quarterly EPS pattern (Estimize)]

Q2's EPS performance is an important piece of novel information with implications for IBM's performance over the rest of the year. Notably, FQ2's actual EPS tends to beat estimates and produce a significant surprise (read: stock reaction) 63% of the time, which not only helps explain the alpha produced by holding IBM over Q2 but is also a phenomenon that is reliably tradable.

Moreover, the risk/reward of this trade greatly favors the bulls. We saw above that the max drawdown of holding over Q2 is acceptable relative to the max drawdown of buy-and-hold, and the reward, too, skews bullish: Q2 earnings moves tend to be 85% larger to the upside than to the downside.

Just the risk/reward of winning $1.85 for every $1.00 risked is enough to justify a trade if the probability of a win is 50%. But recall that the probability of a win is 63%, making this trade what we call a "phoenix" play in Timing the Market. That is, both the probability and the risk/reward are in our favor, and this is the main reason I'm recommending this play.
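
As a quick sanity check on that claim, here is the expected-value arithmetic behind it (the 63% win rate and the 1.85-to-1 payoff come from the backtest above; treating the amount risked as a flat $1.00 is a simplification):

```python
p_win, payoff, risk = 0.63, 1.85, 1.00

# Expected value per $1 risked: win the payoff with probability p_win,
# lose the amount at risk otherwise.
ev = p_win * payoff - (1 - p_win) * risk
print(f"EV per $1 risked at 63%: ${ev:.2f}")                          # roughly $0.80
print(f"EV per $1 risked at 50%: ${0.5 * payoff - 0.5 * risk:.2f}")   # still positive
```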

Of course, it's also good to know what's going on with the company before diving into an earnings trade. We don't want to get snagged by a one-off event, such as a sudden rise in expenses or a business restructuring.

As for IBM, perhaps the biggest happening is simply economic in nature. The economy is seemingly headed toward a recession, and Fed tightening is putting further pressure on corporate earnings. This isn't necessarily bad for IBM, though, as the company is more of a recurring-revenue business - this is not a growth stock, and we will likely see some extra capital inflow into the stock as a defensive holding.

Arvind Krishna, CEO of IBM, stated in the last quarter's earnings call that "demand for technology is going to sit at 4 points to 5 points above GDP. Even if GDP falls to flat or there's a quick recession or if it's a very slight recession, we see demand staying strong and continuing." So, it appears that IBM is at least somewhat recession-proof. As the saying goes (for potential clients), "Nobody gets fired for buying IBM."

From a valuation perspective, IBM is slightly undervalued relative to its peers by several metrics, supporting the bullish thesis. For instance, IBM's price-to-sales is about two-thirds that of the industry average:

[Chart: IBM price-to-sales vs. industry (Simply Wall St.)]

And its price-to-earnings is slightly below average:

[Chart: IBM price-to-earnings vs. industry (Simply Wall St.)]

From an absolute valuation perspective, IBM has about 20% upside:

[Charts: discounted cash flow valuation, risk, free cash flow, terminal value and equity value (Simply Wall St.)]

Overall, the data - including statistical, seasonal, macro, and valuation data - are highly supportive of a long position on IBM over Q2. Here is my trade idea:

  1. Buy Aug19 $140 call
  2. Sell Aug5 $155 call

The short calls are just there to reduce the cost of the play; selling them against the long call cuts the cost by 14%, at the expense of capping the spread's value at $1,500. However, we also get positive theta from this, thereby profiting if IBM trends sideways, thanks to the difference in time decay between the two options. Your max risk is merely the debit of the play, which is $535 at the time of writing.
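
For readers who want to see the floor and the cap explicitly, below is a rough payoff sketch. One caveat: it treats the position as a $140/$155 vertical held to the long call's expiration, which ignores the diagonal's timing wrinkle (the Aug5 short call expires two weeks before the Aug19 long call), and it uses the $535 debit quoted above.

```python
def spread_pnl(ibm_at_expiry, long_strike=140, short_strike=155, debit=5.35):
    """Approximate P&L per spread at expiration, x100 contract multiplier."""
    long_value = max(ibm_at_expiry - long_strike, 0)
    short_value = max(ibm_at_expiry - short_strike, 0)
    return (long_value - short_value - debit) * 100

for price in (130, 140, 145.35, 155, 170):
    print(f"IBM at ${price}: P&L {spread_pnl(price):+,.0f}")

# Worst case is the $535 debit; above $155 the profit caps at
# (155 - 140 - 5.35) * 100 = $965.
```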

Let me know if you have any questions.

Embedded analytics emerges to offer new level of business intelligence

Business analytics is an increasingly powerful tool for organisations, but one that is associated with steep learning curves and significant investments in infrastructure.

The idea of using data to drive better decision-making is well established. But the conventional approach – centred around reporting and analysis tools – relies on specialist applications and highly trained staff. Often, firms find they have to build teams of data scientists to gather the data and manage the tools, and to build queries.

This creates bottlenecks in the flow of information, as business units rely on specialist teams to interrogate the data, and to report back. Even though reporting tools have improved dramatically over the past decade, with a move from spreadsheets to visual dashboards, there is still too much distance between the data and the decision-makers.

Companies and organisations also face myriad data sources. A study from IDC found that close to four in five firms used more than 100 data sources and just under one-third had more than 1,000. Often, this data exists in silos.

As a result, suppliers have developed embedded analytics to bring users closer to the data and, hopefully, lead to faster and more accurate decision-making. Suppliers in the space include ThoughtSpot, Qlik and Tableau, but business intelligence (BI) and data stalwarts such as Informatica, SAS, IBM and Microsoft also have relevant capabilities.

Embedded analytics adds functionality into existing enterprise software and web applications. That way, users no longer need to swap into another application – typically a dashboard or even a BI tool itself – to look at data. Instead, analytics suppliers provide application programming interfaces (APIs) to link their tools to the host application.
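
As a hedged illustration of that pattern – the endpoint, payload and token flow below are invented for the example rather than taken from any particular supplier's API – the host application typically asks the analytics service for a short-lived, scoped token and then renders the returned component in its own user interface:

```python
import requests

ANALYTICS_API = "https://analytics.example.com/v1"  # hypothetical supplier endpoint

def get_embed_url(dashboard_id, user_role):
    """Request a short-lived, role-scoped embed URL from the analytics service.
    The host application keeps its service credentials out of the browser."""
    resp = requests.post(
        f"{ANALYTICS_API}/embed-tokens",
        json={"dashboard": dashboard_id, "role": user_role, "ttl_seconds": 300},
        headers={"Authorization": "Bearer <service-account-token>"},
        timeout=10,
    )
    resp.raise_for_status()
    token = resp.json()["token"]
    return f"{ANALYTICS_API}/embed/{dashboard_id}?token={token}"

# The URL is then dropped into an iframe or web component inside the ERP or
# HR screen, so the user never leaves the application they are working in.
```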

Embedded analytics can be used to provide mobile and remote workers access to decision support information, and even potentially data, on the move. This goes beyond simple alerting tools: systems with embedded analytics built in allow users to see visualisations and to drill down into live data.

And the technology is even being used to provide context-aware information to consumers. Google, for example, uses analytics to present information about how busy a location or service will be, based on variables such as the time of day.

Indeed, some suppliers describe embedded analytics as a “Google for business” because it allows users to access data without technical know-how or an understanding of analytical queries.

“My definition generally is having analytics available in the system,” says Adam Mayer, technical product director at Qlik. “That’s not your dedicated kind of BI tool, but more to the point, I think it’s when you don’t realise that you’re analysing data. It’s just there.”

The trend towards embedding analytics into other applications or web services reflects the reality that there are many more people in enterprises who could benefit from the insights offered by BI than there are users of conventional BI systems.

Firms also want to improve their return on investment in data collection and storage by giving more of the business access to the information they hold. And with the growth of machine learning and artificial intelligence (AI), some of the heavy lifting associated with querying data is being automated.

“What we are trying to do is provide non-technical users the ability to engage with data,” says Damien Brophy, VP for Europe, the Middle East and Africa (EMEA) at ThoughtSpot. “We’re bringing that consumer-like, Google-like experience to enterprise data. It is giving thousands of people access to data, as opposed to five or 10 analysts in the business who then produce content for the rest of the business.”

At one level, embedded analytics stands to replace static reports and potentially dashboards too, without the need to switch applications. That way, an HR or supply chain specialist can view and – to a degree – query data from within their HR or enterprise resource planning (ERP) system, for example.

A field service engineer could use an embedded analysis module within a maintenance application to run basic “what if” queries, to check whether it is better to replace a part now or carry out a minor repair and do a full replacement later.

Embedded analytics to help decision-making

Also, customer service agents are using embedded analytics to help with decision-making and to tailor offers to customers.

Embedded systems are designed to work with live data and even data streams, even where users do not need to drill down into the data. Enterprises are likely to use the same data to drive multiple analysis tools: the analytics, business development or finance teams will use their own tools to carry out complex queries, and a field service or customer service agent might need little more than a red or green traffic light on their screen.

“The basic idea is that every time your traditional reporting process finds the root cause of a business problem, you train your software, either by formal if-then-else rules or via machine learning, to alert you the next time a similar situation is about to arise,” says Duncan Jones, VP and principal analyst at Forrester.

“For instance, suppose you need to investigate suppliers that are late delivering important items. In the old approach, you would create reports about supplier performance, with on-time-delivery KPI and trends and you’d pore through it looking for poor performers.

“The new approach is to create that as a view within your home screen or dashboard, continually alerting you to the worst performers or rapidly deteriorating ones, and triggering a formal workflow for you to record the actions you’ve taken – such as to contact that supplier to find out what it is doing to fix its problems.”
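
A minimal sketch of that alert-up pattern might look like the following; the thresholds, field names and the print stand-in for a workflow trigger are all assumptions made up for illustration:

```python
from statistics import mean

OTD_FLOOR = 0.90       # assumed minimum acceptable on-time-delivery rate
DETERIORATION = 0.10   # assumed drop vs. trailing average that triggers review

def check_supplier(name, monthly_otd):
    """Return an alert message when a supplier breaches a rule, else None."""
    latest, history = monthly_otd[-1], monthly_otd[:-1]
    if latest < OTD_FLOOR:
        return f"{name}: on-time delivery {latest:.0%} is below the {OTD_FLOOR:.0%} floor"
    if history and latest < mean(history) - DETERIORATION:
        return f"{name}: rapidly deteriorating vs. trailing average"
    return None

suppliers = {"Acme Metals": [0.97, 0.96, 0.82], "Delta Parts": [0.95, 0.94, 0.93]}
for name, otd in suppliers.items():
    alert = check_supplier(name, otd)
    if alert:
        print(alert)  # in production this would open a workflow item, not print
```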

This type of alerting helps businesses, because it speeds up the decision-making process by providing better access to data that the organisation already holds.

“It’s partly businesses’ need to move faster, to react more quickly to issues,” says Jones. “It’s also evolution of the technology to make embedded alert-up analytics easier to deliver.”

Embedded analytics suppliers are also taking advantage of the trend for businesses to store more of their data in the cloud, making it easier to link to multiple applications via APIs. Some are going a step further and offering analytical services too: a firm might no longer need expertise in BI, as the supplier can offer its own analytical capabilities.

Again, this could be via the cloud, but serving the results back to the users in their own application. And it could even go further by allowing different users to analyse data in their own workflow-native applications.

A “smart” medical device, such as an asthma inhaler, could provide an individual’s clinical data to their doctor, but anonymised and aggregated data to the manufacturer to allow them to plan drug manufacturing capacity better.

“Data now is changing so quickly, you really need intraday reporting,” says Lee Howells, an analytics specialist at PA Consulting. “If we can put that in on a portal and allow people to see it as it happened, or interact with it, they are then able to drill down on it.

“It’s putting that data where employees can use it and those employees can be anyone from the CEO to people on operations.”

But if the advantage of embedded analytics lies in its ability to tailor data to the users’ roles and day-to-day applications, it still relies on the fundamentals of robust BI systems.

Firms considering embedded analytics need to look at data quality, data protection and data governance.

They also need to pay attention to security and privacy: the central data warehouse or data lake might have robust security controls, but does the application connecting via an API? Client software embedding the data should have equal security levels.

Cleaner data is critical

And, although cleaning data is always important for effective analytics and business intelligence, it becomes all the more critical when the users are not data scientists. They need to know that they can trust the data, and if the data is imperfect or incomplete, this needs to be flagged.

A data scientist working on an analytics team will have an instinctive feel for data quality and reliability, and will understand that data need not be 100% complete to improve decision-making. But a user in the field, or a senior manager, might not.

“Embedded analytics continues the democratisation of data, bringing data and insight directly to the business user within their natural workflow,” says Greg Hanson, VP for EMEA at Informatica.

“This fosters a culture of data-driven decision-making and can speed time to value. However, for CDOs [chief data officers] and CIOs, the crucial question must be: ‘is it accurate, is it trustworthy and can I rely on it?’ For embedded analytics programmes to be a success, organisations need confidence that the data fuelling them is from the right sources, is high quality and the lineage is understood.”

CDOs should also consider starting small and scaling up. The usefulness of real-time data will vary from workflow to workflow. Some suppliers’ APIs will integrate better with the host application than others. And users will need time to become comfortable making decisions based on the data they see, but also to develop a feel for when questions are better passed on to the analytics or data science team.

“Organisations, as part of their next step forward, have come to us with their cloud infrastructure or data lakes already in place, and they have started to transform their data engineering into something that can be used,” says PA's Howells. “Sometimes they put several small use cases in place as proofs of concept and proofs of value. Some data isn't as well used as it could be. I think that's going to be a continually evolving capability.”

The Retiree's Dividend Portfolio - Jane's June Update: Record Dividends
[Image: Oil refinery, chemical and petrochemical plant (zorazhuang)]

Background

For those who are interested in John and Jane's full background, please click the following link for the last time I published their full story. The details below are updated for 2022.

  • This is a real portfolio with real shares being traded.
  • I am not a financial advisor and merely provide guidance based on a relationship that goes back several years.
  • John retired in January 2018 and now only collects Social Security income as his regular source of income.
  • Jane officially retired at the beginning of 2021, and she is collecting Social Security as her only regular source of income.
  • John and Jane have decided to start taking draws from the Taxable Account and John's Traditional IRA to the tune of $1,000/month each. These draws are currently covered in full by the dividends generated in each account.
  • John and Jane have other investments outside of what I manage. These investments primarily consist of minimal-risk bonds and low-yield certificates.
  • John and Jane have no debt and no monthly payments other than basic recurring bills such as water, power, property taxes, etc.

I started helping John and Jane with their retirement accounts because I was infuriated by the fees their previous financial advisor was charging them. I do not charge John and Jane for anything that I do, and all I have asked of them is that they allow me to write about their portfolio anonymously in order to help spread knowledge and to make me a better investor in the process.

Generating a stable and growing dividend income is the primary focus of this portfolio, and capital appreciation is the least important characteristic. My primary goal was to provide John and Jane as much certainty in their retirement as I possibly can because this has been a constant point of stress over the last decade.

Dividend Decreases

No stocks in Jane's Traditional or Roth IRA paid a decreased dividend during the month of June.

Dividend And Distribution Increases

Three companies paid increased dividends/distributions or a special dividend during the month of June in the Traditional and Roth IRAs.

  • International Business Machines (IBM)
  • LyondellBasell (LYB)
  • Main Street Capital (MAIN).

International Business Machines

IBM continues to be the dividend stock that investors love to hate. For years the concern has been a slow but steady drop-off in revenue, which has put pressure on corporate earnings and ultimately limited the company's ability to grow its dividend. The most recent increase is a perfect example of the problem this has created, with the average three-year dividend growth rate coming in at less than 2.5% while the 10-year average dividend growth rate comes in at 8.17%. This is a problem for a tech company like IBM, which is why it currently yields a whopping 4.61% and why the share price has been stagnant for so long. The acquisition of Red Hat ("RHT") appears to have given IBM a new sense of relevance in the hybrid cloud platform space. Another positive is that the company has been able to deleverage since the acquisition, with debt levels closing in on where they stood prior to the RHT deal.

We have sold shares of IBM at $140/share and higher over the last year but view the stock as a buy under $130/share (I prefer under $125/share). With the current position carrying an average cost basis of $122/share, we do not plan on selling any shares in the near future.

[Chart: data by YCharts]

The dividend was increased from $1.64/share per quarter to $1.65/share per quarter. This represents an increase of 0.6% and a new full-year payout of $6.60/share compared with the previous $6.56/share. This results in a current yield of approximately 4.74% based on the current share price of $139.18.
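
The same forward-yield arithmetic recurs throughout these updates, so here it is spelled out once, using the IBM figures from the paragraph above:

```python
def dividend_stats(old_quarterly, new_quarterly, share_price):
    """Quarterly-increase percentage, annualized payout and forward yield."""
    increase = new_quarterly / old_quarterly - 1
    annual_payout = new_quarterly * 4
    return increase, annual_payout, annual_payout / share_price

increase, annual, fwd_yield = dividend_stats(1.64, 1.65, 139.18)
print(f"Increase: {increase:.1%}")           # 0.6%
print(f"Full-year payout: ${annual:.2f}")    # $6.60
print(f"Forward yield: {fwd_yield:.2%}")     # ~4.74%
```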

LyondellBasell

It’s not every day that a company raises its dividend and offers a massive special dividend payout at the same time. The awesome announcement was accompanied by the following statement:

"LyondellBasell established new records for cash generation in 2021 and we have a strong outlook for our company. Capital returns have always been an important component of LyondellBasell's value proposition for shareholders. 2022 will mark our 12th consecutive year of regular dividend growth. The combination of today's special and quarterly dividends returns $2.1 billion to shareholders. As the incoming CEO, I would like to make it very clear that I support the continuation of our balanced and disciplined capital allocation strategy with both dividends and share repurchases playing a central role."

We sold shares prior to the dividend announcement as the stock pushed toward its 52-week high. The 25 shares we sold went at $108.35/share and were used to reduce the position's exposure to high-cost shares that had been purchased at around $115/share. We have since added 20 shares back at a major discount and plan to add more. Analyst downgrades have been common in the news, but I see a Strong Buy under $90/share and enjoy locking in the 5%+ yield in the meantime.

[Chart: LyondellBasell, July (FastGraphs)]

The dividend was increased from $1.13/share per quarter to $1.19/share per quarter. This represents an increase of 5.3% and a new full-year payout of $4.76/share compared with the previous $4.52/share. This results in a current yield of approximately 5.54% based on the current share price of $85.93.

LYB also paid a special dividend of $5.20/share on June 13th, 2022.

Main Street Capital

Q2-2022 earnings are due out in less than a month, and I expect they will demonstrate many of the strengths that pushed Q1-2022 to record levels on multiple metrics. Q1-2022 recorded interest income of $59.4 million compared with $43.5 million in Q1-2021, and we expect this number to continue improving because most of MAIN's portfolio is variable rate and therefore generates more income when the Federal Reserve raises rates. Another important indicator is net asset value, which at $25.89/share is up from $25.59/share in the previous quarter. MAIN's management is top-notch and has always been consistently shareholder-friendly and focused on long-term results.

Although the NAV continues to climb, shares are not cheap by any means. The recent pullback into the $34/share range represented a buying opportunity, and we nibbled a little too early when the stock dropped below $40/share. We are hesitant to add too much more exposure to MAIN, so we will be looking for a price under $35/share. For those who prefer to follow the dividend yield metric, I would say a yield close to 7% would be the best/most opportunistic entry point. Other than COVID, this does not happen often, so buyers need to be prepared to act when the opportunity arises.

[Chart: data by YCharts]

MAIN paid a special dividend of $0.075/share on June 30th, 2022.

Retirement Account Positions

There are currently 39 different positions in Jane's Traditional IRA and 23 different positions in Jane's Roth IRA. While this may seem like a lot, it is important to remember that many of these stocks cross over in both accounts and are also held in the Taxable Portfolio.

Below is a list of the trades that took place in the Traditional IRA during the month of June.

[Table: Traditional IRA, June trades (Charles Schwab)]

Below is a list of the trades that took place in the Roth IRA during the month of June.

[Table: Roth IRA, June trades (Charles Schwab)]

Agree Realty Preferred Series A

This awesome monthly dividend payer has a current share price that is too high for us to consider adding more. I really like Agree Realty's (ADC) portfolio, but again, its share price is too high to justify adding common shares at this point in time. Funny enough, I found out about the company's preferred shares through comments left on a previous portfolio update for John's retirement accounts. At its $25 PAR price, ADC.PRA trades at a yield of 4.25%, which isn't compelling in the current rate environment, but at the time of purchase we were able to buy all portions of the position for less than $18/share, or a yield close to 6%. Additionally, if the shares are held to term and called at the $25 PAR price, this will result in a gain of $7/share, or a total of $700 in capital gains. We plan to continue adding to this position as long as shares remain attractive.
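
To show where those figures come from, here is the arithmetic on the numbers above (the 100-share position size is an assumption that makes the quoted $700 gain work out):

```python
PAR, COUPON, BUY_PRICE, SHARES = 25.00, 0.0425, 18.00, 100  # 100 shares assumed

annual_dividend = PAR * COUPON               # $1.0625/share, fixed at issuance
yield_on_cost = annual_dividend / BUY_PRICE  # buying at a discount lifts the yield
gain_if_called = (PAR - BUY_PRICE) * SHARES  # payoff if redeemed at PAR

print(f"Yield on cost: {yield_on_cost:.1%}")              # ~5.9%, 'close to 6%'
print(f"Capital gain if called: ${gain_if_called:,.0f}")  # $700
```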

Alexandria Realty

Alexandria Realty (ARE) is another new position in Jane's Traditional IRA, entered at $136/share and well off its January 2022 high of $225/share. ARE's 10-year average P/AFFO is approximately 25.5x, and the stock currently trades at a P/AFFO of 23.2x. The last time ARE traded at a discount to its average P/AFFO was during COVID, and before that only briefly at the end of 2018/early 2019. For a compelling review of ARE's situation, I would recommend reading Dane Bowler's article Alexandria Is Life Science Growth At An Office Discount.

[Chart: Alexandria Real Estate, July (FastGraphs)]

Kyndryl Holdings

We originally held on to Kyndryl Holdings (KD) after it was spun off from IBM. Simply put, the stock has performed terribly, and with a whopping total of 18 shares we felt it was time to say goodbye to this company. KD pays no dividend, and with its speculative growth profile it doesn't have a place in Jane's portfolio over the long term.

[Chart: data by YCharts]

Lexington Preferred Series C

LXP.PC typically trades above its PAR value of $50/share. Whenever the stock drops to (or in some cases below) $50/share, I try to purchase some because it is a solid income investment with a 6.5% yield. These shares are what we refer to as non-callable preferred shares, which provide all the benefits of preferred stock with no set redemption date. The price of these shares has been steady even when LXP's business model was in question (the company has made a significant transition over the last five years and now focuses on industrial real estate).

If anyone has questions about the other trades that took place in either the Traditional IRA or Roth IRA, feel free to ask in the comment section and I will be happy to discuss them.

June Income Tracker - 2021 Vs. 2022

Income for the month of June was up significantly year-over-year for Jane's Traditional IRA and up considerably for the Roth IRA. The average monthly income for the Traditional IRA in 2022 is expected to be up about 11.3% based on current estimates (this is up from 5.3% in May due to LYB's special dividend) and the Roth IRA is looking to grow by 5.3%. This means the Traditional IRA would generate an average monthly income of $1,543.26/month and the Roth IRA would generate an average income of $623.97/month. This compares with 2021 figures that were $1,386.13/month and $592.61/month, respectively.

SNLH = Stocks No Longer Held - Dividends in this row represent the dividends collected on stocks that are no longer held in that portfolio. We still count the dividend income that comes from stocks no longer held in the portfolio even though it is non-recurring.

All images below come from Consistent Dividend Investor, LLC. (Abbreviated to CDI).

[Table: Traditional IRA, 2021 vs. 2022 June dividends (CDI)]

[Table: Roth IRA, 2021 vs. 2022 June dividends (CDI)]

Here is a graphical illustration of the dividends received on a monthly basis for the Traditional and Roth IRAs.

[Graph: Monthly dividends bar graph, 2022 retirement projections (CDI)]

The table below represents the actual full-year results for 2022 and the prior year.

[Table: Retirement projections, 2022 June (CDI)]

Below is an expanded table that shows the full dividend history since inception for both the Traditional IRA and Roth IRA.

[Table: Five-year dividend history, Traditional and Roth IRAs (CDI)]

I have included line graphs that better represent the trends associated with Jane's monthly dividend income generated by her retirement accounts. The images below represent the Traditional IRA and Roth IRA, respectively.

[Graphs: Monthly dividend trends, Traditional and Roth IRAs (CDI)]

Here is a table to show how the account balances stack up year over year (I previously used a graph but believe the table is more informative).

It is worth noting that with John and Jane Retired, there will be no additional contributions to these accounts. In fact, they have already begun to take regular distributions from the Taxable Account and John's Traditional IRA.

[Table: Retirement account balances, 2022 June (CDI)]

The next images are the tables that indicate how much cash Jane had in her Traditional and Roth IRA Accounts at the end of the month as indicated on their Charles Schwab statements.

[Table: Month-end cash balances (CDI)]

The next image provides a history of the unrealized gain/loss at the end of each month in the Traditional and Roth IRAs going back to the beginning in January of 2018.

[Table: Unrealized gain/loss history (CDI)]

I like to show readers the actual unrealized gain/loss associated with each position in the portfolio because it is important to consider that in order to become a proper dividend investor, it is necessary to learn how to live with volatility. The market value and cost basis below are as of the market close on July 13th.

Here is the unrealized gain/loss associated with Jane's Traditional and Roth IRAs.

[Table: Traditional IRA unrealized gain/loss (CDI)]

[Table: Roth IRA unrealized gain/loss (CDI)]

The last two graphs show how dividend income has increased, stayed the same, or decreased in each respective month on an annualized basis. I believe these graphs will continue to become more valuable as more years of data become available (with the fifth year of data now added, we can really see the trajectory of the income change for each month).

[Graph: Traditional IRA, monthly year-over-year comparison (CDI)]

[Graph: Roth IRA, monthly year-over-year comparison (CDI)]

Conclusion

June was a rough month for account balances, but the special dividends and increases were more than enough to compensate for the temporary drop in account value. In addition, readers can see a significant number of trades, which allowed us to rotate capital between sectors, trim exposure to certain positions and build up others that we consider undervalued.

June Articles

I have provided the link to the June 2022 Taxable Account below.

The Retirees' Dividend Portfolio: John And Jane's June Taxable Account Update

In Jane's Traditional and Roth IRAs, she is currently long the following mentioned in this article: AbbVie (NYSE:ABBV), Agree Realty (NYSE:ADC), Agree Realty Preferred Series A (ADC.PRA), Archer-Daniels-Midland (NYSE:ADM), Broadcom (NASDAQ:AVGO), Avient (NYSE:AVNT), Broadcom Preferred Series A (NASDAQ:AVGOP), Boeing (NYSE:BA), Bank of America (NYSE:BAC), Black Hills Corp. (NYSE:BKH), BlackRock Health Sciences Trust (NYSE:BME), Bank of Montreal (NYSE:BMO), Bank of Nova Scotia (NYSE:BNS), BP (NYSE:BP), British American Tobacco (NYSE:BTI), Canadian Imperial Bank of Commerce (NYSE:CM), Cummins (NYSE:CMI), Concentrix (NASDAQ:CNXC), Digital Realty (NYSE:DLR), Eaton Vance Floating-Rate Advantage Fund A (MUTF:EAFAX), Enbridge (NYSE:ENB), EPR Properties Preferred Series E (NYSE:EPR.PE), Eaton Corporation (NYSE:ETN), Emera Inc. (OTCPK:EMRAF), East West Bancorp (NASDAQ:EWBC), General Mills (NYSE:GIS), GasLog Partners Preferred C (NYSE:GLOP.PC), Honeywell (NASDAQ:HON), International Business Machines (NYSE:IBM), Iron Mountain (NYSE:IRM), Lexington Realty Preferred Series C (NYSE:LXP.PC), Lumen Technologies (NYSE:LUMN), LyondellBasell (NYSE:LYB), Main Street Capital (NYSE:MAIN), McGrath RentCorp (NASDAQ:MGRC), 3M (NYSE:MMM), Altria (NYSE:MO), Annaly Capital Preferred Series G (NYSE:NLY.PG), NextEra Energy (NYSE:NEE), NetApp (NASDAQ:NTAP), Realty Income (NYSE:O), OGE Energy Corp. (NYSE:OGE), Oxford Lane Capital Corp. 6.75% Cum Red Pfd Shares Series 2024 (NASDAQ:OXLCM), Philip Morris (NYSE:PM), PPG Industries (NYSE:PPG), PIMCO Corporate & Income Opportunity Fund (PTY), Cohen & Steers REIT & Preferred Income Fund (NYSE:RNP), Royal Bank of Canada (NYSE:RY), TD SYNNEX Corp. (NYSE:SNX), STORE Capital (NYSE:STOR), Toronto-Dominion Bank (NYSE:TD), Unilever (NYSE:UL), UMH Properties (UMH), Verizon (NYSE:VZ), Williams Companies (NYSE:WMB), W. P. Carey (NYSE:WPC).

Trickbot may be carrying water for Russia

In a report out this morning, IBM security researchers say that Trickbot, one of the most active ransomware distributors of the past several years, has hit targets inside Ukraine in six separate ...

Microsoft Drops Emotion Recognition as Facial Analysis Concerns Grow

Despite facial recognition technology’s potential, it faces mounting ethical questions and issues of bias.

To address those concerns, Microsoft recently released its Responsible AI Standard and made a number of changes, the most noteworthy of which is to retire the company's emotion recognition AI technology.

Responsible AI

Microsoft’s new policy contains a number of major announcements.

  • New customers must apply for access to use facial recognition operations in Azure Face API, Computer Vision and Video Indexer, and existing customers have one year to apply and be approved for continued access to the facial recognition services.
  • Microsoft’s policy of Limited Access adds use case and customer eligibility requirements to access the services.
  • Facial detection capabilities—including detecting blur, exposure, glasses, head pose, landmarks, noise, occlusion, and facial bounding box—will remain generally available and do not require an application.

The centerpiece of the announcement is that the software giant “will retire facial analysis capabilities that purport to infer emotional states and identity attributes such as gender, age, smile, facial hair, hair, and makeup.”

Microsoft noted that “the inability to generalize the linkage between facial expression and emotional state across use cases, regions, and demographics…opens up a wide range of ways they can be misused—including subjecting people to stereotyping, discrimination, or unfair denial of services.”


Moving Away from Facial Analysis

There are a number of reasons why major IT players have been moving away from facial recognition technologies, including limiting law enforcement access to the technology.

Fairness concerns

Automated facial analysis and facial recognition software have always generated controversy. Combined with the societal biases often inherent in AI systems, the potential to exacerbate issues of bias intensifies. Many commercial facial analysis systems today inadvertently exhibit bias in categories such as race, age, culture, ethnicity, and gender. Microsoft's Responsible AI Standard implementation aims to help the company get ahead of potential issues of bias through its outlined Fairness Goals and Requirements.

Appropriate use controls

Despite Azure AI Custom Neural Voice's boundless potential in entertainment, accessibility, and education, it could also be greatly misused to deceive listeners by impersonating speakers. Microsoft's Responsible AI program, together with the Sensitive Uses review process essential to the Responsible AI Standard, reviewed its Facial Recognition and Custom Neural Voice technologies to develop a layered control framework. By limiting these technologies and implementing these controls, Microsoft hopes to safeguard the technologies and users from misuse while ensuring that their implementations are of value.

Lack of consensus on emotions

Microsoft’s decision to do away with public access to the emotion recognition and facial characteristics identification features of its AI is due to the lack of a distinct consensus on the definition of emotions. Experts from within and outside the company have pointed out the effect of this lack of consensus on emotion recognition technology products, as they generalize inferences across demographics, regions and use cases. This hinders the ability of the technology to provide appropriate solutions to the problems it aims to solve and ultimately impacts its trustworthiness.

The skepticism associated with the technology stems from its disputed efficacy and the justification for its use. Human rights groups contend that emotion AI is discriminatory and manipulative. One study found that emotion AI consistently identified White subjects as having more positive emotions than Black subjects across two different facial recognition software platforms.

Intensifying privacy concerns

There is increasing scrutiny of facial recognition technologies and their unethical use for public surveillance and mass face detection without consent. Even though facial analysis collects generic data that is kept anonymous—such as Azure Face’s service that infers identity attributes like gender, hair, age, and more—anonymization does not alleviate ever-growing privacy concerns. Aside from consenting to such technologies, subjects may often harbor concerns about how the data collected by these technologies is stored, protected and used.


Facial Detection and Bias

Algorithmic bias occurs when machine learning algorithms reproduce the biases of either their creators or their input data. The large-scale usage of these models in our technology-dependent lives means that their use cases are at risk of adopting and proliferating mass-produced biases.

Facial detection technologies struggle to produce accurate results in use cases involving women, dark-skinned people, and older adults, as these technologies are commonly trained on facial image datasets dominated by Caucasian subjects. Bias in facial analysis and facial recognition technologies yields real-life consequences, such as the following examples.

Inaccuracy

Despite the strides that facial detection technologies have taken, bias often yields inaccurate results. Studies show that face detection technologies generally perform better on lighter skin complexions. One study reported a maximum error rate of 0.8% for identifying lighter-skinned males, compared with up to 34.7% for dark-skinned women.
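Disparities like the 0.8% versus 34.7% figures above come from computing error rates separately for each demographic group in a labeled benchmark. The following is a generic, hypothetical sketch of that tally, not tied to any particular vendor's benchmark or dataset; the group names and counts are invented to mirror the cited figures.

```python
from collections import defaultdict

def error_rate_by_group(records):
    """records: iterable of (group, predicted_id, true_id) triples from a
    labeled benchmark. Returns each group's misidentification rate, the
    metric behind the disparity figures cited above."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, truth in records:
        totals[group] += 1
        errors[group] += predicted != truth  # bool adds as 0 or 1
    return {group: errors[group] / totals[group] for group in totals}

# Toy illustration with invented numbers:
sample = (
    [("lighter-skinned men", "A", "A")] * 992
    + [("lighter-skinned men", "A", "B")] * 8
    + [("darker-skinned women", "C", "C")] * 653
    + [("darker-skinned women", "C", "D")] * 347
)
print(error_rate_by_group(sample))
# {'lighter-skinned men': 0.008, 'darker-skinned women': 0.347}
```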

The failures in recognizing the faces of dark-skinned people have led to instances where the technology has been used wrongly by law enforcement. In February 2019, a Black man was accused of not only shoplifting but also attempting to hit a police officer with a car even though he was forty miles away from the scene of the crime at the time. He spent 10 days in jail and his defense cost him $5,000.

Since the case was dismissed for lack of evidence in November 2019, the man is suing the authorities involved for false arrest, imprisonment and civil rights violation. In a similar case, another man was wrongfully arrested as a result of inaccuracy in facial recognition. Such inaccuracies raise concerns about how many wrongful arrests and convictions may have taken place.

Several vendors of the technology, such as IBM, Amazon, and Microsoft, are aware of such limitations in areas like law enforcement and the implication of the technology for racial injustice and have moved to prevent potential misuse of their software. Microsoft’s policy prohibits the use of its Azure Face by or for state police in the United States.

Decision making

It is not uncommon to find facial analysis technology being used to assist in the evaluation of video interviews with job candidates. These tools influence recruiters’ hiring decisions using data they generate by analyzing facial expressions, movements, choice of words, and vocal tone. Such use cases are meant to lower hiring costs and increase efficiency by expediting the screening and recruitment of new hires.

However, failure to train such algorithms on datasets that are both large and diverse enough introduces bias. Such bias may deem certain people more suitable for employment than others. False positives or negatives may lead to the hiring of an unsuitable candidate or the rejection of the most suitable one. As long as the bias remains, similar results are likely in any context where the technology is used to make decisions based on people's faces.

What’s Next for Facial Analysis?

All of this doesn’t mean that Microsoft is discarding its facial analysis and recognition technology entirely, as the company recognizes that these features and capabilities can yield value in controlled accessibility contexts. Microsoft’s biometric systems such as facial recognition will be limited to partners and customers of managed services. The availability of facial analysis will continue to be available to users until June 30, 2023, via the Limited Access arrangement.

Limited Access only applies to users working directly with the Microsoft accounts team, and Microsoft has provided a list of approved Limited Access use cases. Users have until June 30, 2023, to submit applications for approval to continue using the technology, and approved systems will be limited to use cases that are deemed acceptable. Additionally, a code of conduct and guardrails will be used to ensure authorized users do not misuse the technology.

The Computer Vision and Video Indexer celebrity recognition features are also subject to Limited Access, as is Video Indexer's face identification. Customers will no longer have general access to facial recognition from these two services, in addition to Azure Face API.

As a result of its review, Microsoft announced, “We are undertaking responsible data collections to identify and mitigate disparities in the performance of the technology across demographic groups and assessing ways to present this information in a way that would be insightful and actionable for our customers.”


Global IIoT Market Report 2022, Featuring Profiles of IBM, Intel, Schneider Electric, General Electric, Emerson Electric and Texas Instruments

Dublin, July 18, 2022 (GLOBE NEWSWIRE) -- The "Industrial Internet of Things Market - Global Industry Analysis, Size, Share, Growth, Trends, and Forecast, 2021-2031" report has been added to ResearchAndMarkets.com's offering.

The study presents detailed information about the important growth factors, restraints, and key trends shaping the future growth of the Industrial Internet of Things (IIoT) market, helping stakeholders identify opportunistic avenues of business potential.

The report also provides insightful information about how the Industrial Internet of Things (IIoT) market is expected to progress during the forecast period 2021-2031.

The report offers intricate dynamics about the different aspects of the IIoT market that aid companies operating in the market in making strategic development decisions. This study elaborates on the significant changes that are highly anticipated to shape the growth of the IIoT market during the forecast period. It also includes a key indicator assessment to highlight the growth prospects of the IIoT market, and estimates statistics related to the market's progress in terms of value (US$ Bn).

The study provides detailed segmentation of the IIoT market, along with country analysis, key information, and a competitive outlook. The report mentions the company profiles of key players that are currently dominating the IIoT market, wherein various developments, expansion, and winning strategies practiced and executed by leading players have been presented in detail.

Key Questions Answered:

  • Which regions will continue to remain the most profitable regional markets for Industrial Internet of Things (IIoT) market players?

  • Which factors will induce a change in demand for Industrial Internet of Things (IIoT) during the assessment period?

  • How will changing trends impact the Industrial Internet of Things (IIoT) market?

  • How will COVID-19 impact the Industrial Internet of Things (IIoT) market?

  • How can market players capture low-hanging opportunities in the Industrial Internet of Things (IIoT) market in developed regions?

  • Which companies are leading the Industrial Internet of Things (IIoT) market?

  • What are the winning strategies of stakeholders in the Industrial Internet of Things (IIoT) market to upscale their position in this landscape?

  • What will be the Y-o-Y growth of the Industrial Internet of Things (IIoT) market between 2021 and 2031?

  • What are the winning imperatives of market frontrunners in the Industrial Internet of Things (IIoT) market?

Key Topics Covered:

1. Preface
1.1. Market Introduction
1.2. Market Segmentation
1.3. Key Research Objectives

2. Assumptions and Research Methodology
2.1. Research Methodology
2.2. Key Assumptions for Data Modelling

3. Executive Summary: Global Industrial Internet of Things (IIoT) Market

4. Market Overview
4.1. Market Definition
4.2. Technology/ Product Roadmap
4.3. Market Factor Analysis
4.4. COVID-19 Impact Analysis
4.5. Market Opportunity Assessment - by Region (North America/ Europe/ Asia Pacific/ Middle East & Africa/ South America)

5. Global Industrial Internet of Things (IIoT) Market Analysis and Forecast
5.1. Market Revenue Analysis (US$ Bn), 2016-2031
5.2. Pricing Model Analysis/ Price Trend Analysis

6. Global Industrial Internet of Things (IIoT) Market Analysis, by Component
6.1. Overview and Definitions
6.2. Key Segment Analysis
6.3. Industrial Internet of Things (IIoT) Market Size (US$ Bn) Forecast, by Component, 2018 - 2031

7. Global Industrial Internet of Things (IIoT) Market Analysis, by Industry
7.1. Overview and Definitions
7.2. Key Segment Analysis
7.3. Industrial Internet of Things (IIoT) Market Size (US$ Bn) Forecast, by Industry, 2018 - 2031

8. Global Industrial Internet of Things (IIoT) Market Analysis and Forecasts, by Region
8.1. Key Findings
8.2. Market Size (US$ Bn) Forecast by Region, 2018-2031

9. North America Industrial Internet of Things (IIoT) Market Analysis and Forecast

10. Europe Industrial Internet of Things (IIoT) Market Analysis and Forecast

11. Asia Pacific Industrial Internet of Things (IIoT) Market Analysis and Forecast

12. Middle East & Africa Industrial Internet of Things (IIoT) Market Analysis and Forecast

13. South America Industrial Internet of Things (IIoT) Market Analysis and Forecast

14. Competition Landscape
14.1. Market Competition Matrix, by Leading Players
14.2. Market Revenue Share Analysis (%), by Leading Players (2020)
14.3. Competitive Scenario

15. Company Profiles

  • IBM

  • Intel

  • Schneider Electric SE

  • General Electric Company

  • Emerson Electric

  • ABB Ltd.

  • Accenture PLC

  • Tech Mahindra

  • Softweb Solutions

  • ZIH Corp.

  • Siemens AG

  • Robert Bosch GmbH

  • NEC

  • Kuka AG

  • Huawei Technology

  • Dassault Systemes

  • Texas Instruments

For more information about this report visit https://www.researchandmarkets.com/r/vjcyx7

About ResearchAndMarkets.com
ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

CONTACT: ResearchAndMarkets.com
Laura Wood, Senior Press Manager
press@researchandmarkets.com
For E.S.T office hours, call 1-917-300-0470
For U.S./CAN, toll free, call 1-800-526-8630
For GMT office hours, call +353-1-416-8900
Bobidi launches to reward developers for testing companies' AI models

In the rush to build, test and deploy AI systems, businesses often lack the resources and time to fully validate their systems and ensure they're bug-free. In a 2018 report, Gartner predicted that 85% of AI projects will deliver erroneous outcomes due to bias in data, algorithms or the teams responsible for managing them. Even Big Tech companies aren't immune to the pitfalls — for one client, IBM ultimately failed to deliver an AI-powered cancer diagnostics system that wound up costing $62 million over 4 years.

Inspired by "bug bounty" programs, Jeong-Suh Choi and Soohyun Bae founded Bobidi, a platform aimed at helping companies validate their AI systems by exposing the systems to the global data science community. With Bobidi, Bae and Choi sought to build a product that lets customers connect AI systems with the bug-hunting community in a "secure" way, via an API.

The idea is to let developers test AI systems for biases — that is, the edge cases where the systems perform poorly — to reduce the time needed for validation, Choi explained in an email interview. Bae was previously a senior engineer at Google and led augmented reality mapping at Niantic, while Choi was a senior manager at eBay and headed the "people engineering" team at Facebook. The two met at a tech industry function about 10 years ago.

"By the time bias or flaws are revealed from the model, the damage is already irrevocable," Choi said. "For example, natural language processing algorithms [like OpenAI's GPT-3] are often found to be making problematic comments, or mis-responding to those comments, related to hate speech, discrimination, and insults. Using Bobidi, the community can 'pre-test' the algorithm and find those loopholes, which is actually very powerful as you can test the algorithm with a lot of people under certain conditions that represent social and political contexts that change constantly."

To test models, the Bobidi "community" of developers builds a validation dataset for a given system. As developers attempt to find loopholes in the system, customers get an analysis that includes patterns of false negatives and positives and the metadata associated with them (e.g., the number of edge cases).
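Bobidi has not published technical details of its API, so the following is a purely hypothetical sketch of the bookkeeping such an analysis implies: tallying false positives and negatives across community attempts, with the per-attempt pricing described just below folded in. Every name here is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Attempt:
    """One hypothetical community attempt against a customer's model."""
    model_flagged: bool  # what the model predicted on the submitted edge case
    ground_truth: bool   # the label the community/customer agreed on

def summarize(attempts: list[Attempt]) -> dict:
    """Aggregate attempts into the kind of report the article describes."""
    false_positives = sum(a.model_flagged and not a.ground_truth for a in attempts)
    false_negatives = sum(not a.model_flagged and a.ground_truth for a in attempts)
    return {
        "attempts": len(attempts),
        "false_positives": false_positives,
        "false_negatives": false_negatives,
        "estimated_cost_usd": round(len(attempts) * 0.99 / 10, 2),  # $0.99 per 10 attempts
    }

print(summarize([Attempt(True, False), Attempt(False, True), Attempt(True, True)]))
```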

Exposing sensitive systems and models to the outside world might give some companies pause, but Choi asserts that Bobidi "auto-expires" models after a certain number of days so that they can't be reverse-engineered. Customers pay for service based on the number of "legit" attempts made by the community, which works out to roughly a dollar ($0.99) per 10 attempts.

Choi notes that the amount of money developers can make through Bobidi — $10 to $20 per hour — is substantially above the minimum wage in many regions around the world. Assuming Choi's estimations are rooted in fact, Bobidi bucks the trend in the data science industry, which tends to pay data validators and labelers poorly. The annotators of the widely used ImageNet computer vision dataset made a median wage of $2 per hour, one study found, with only 4% making more than $7.25 per hour.

Pay structure aside, crowd-powered validation isn't a new idea. In 2017, the Computational Linguistics and Information Processing Laboratory at the University of Maryland launched a platform called Break It, Build It that let researchers submit models to users tasked with coming up with examples to defeat them. Elsewhere, Meta maintains a platform called Dynabench that has users "fool" models designed to analyze sentiment, answer questions, detect hate speech and more.

But Bae and Choi believe the "gamified" approach will help Bobidi stand out from the pack. While it's early days, the vendor claims to have customers in augmented reality and computer vision startups, including Seerslab, Deepixel and Gunsens.

The traction was enough to convince several investors to pledge money toward the venture. Today, Bobidi closed a $5.5 million seed round with participation from Y Combinator, We Ventures, Hyundai Motor Group, Scrum Ventures, New Product Experimentation (NPE) at Meta, Lotte Ventures, Atlas Pac Capital and several undisclosed angel investors.

Of note, Bobidi is among the first investments for NPE, which shifted gears last year from building consumer-facing apps to making seed-stage investments in AI-focused startups. When contacted for comment, head of NPE investments Sunita Parasuraman said via email: "We're thrilled to back the talented founders of Bobidi, who are helping companies better validate AI models with an innovative solution driven by people around the globe."

"Bobidi is a mashup between community and AI, a unique combination of expertise that we share," Choi added. "We believe that the era of big data is ending and we're about to enter the new era of quality data. It means we are moving from the era — where the focus was to build the best model given with the datasets — to the new era, where people are tasked to find the best dataset given with the model-complete opposite approach."

Choi said that the proceeds from the seed round will be put toward hiring — Bobidi currently has 12 employees — and building "customer insights experiences" and various "core machine learning technologies." The company hopes to triple the size of its team by the end of the year despite economic headwinds.

With 26% CAGR, Learning Analytics Market Size worth USD 8.3 Billion in 2027

Jul 08, 2022 (Heraldkeepers) -- Market Overview

The learning analytics market is expected to reach USD 8.3 Billion at a CAGR of 26%, up from a valuation of USD 3.1 Billion in the historic period. The market has considerable room for worldwide expansion across its primary geographies, thanks to rising needs and trends. Learning analytics is, at its core, the science of analytical thinking. Interactive learning interfaces provide the necessary support for learning analytics and are regarded as essential components of the strategy. Learning analytics combines learning and data analysis with human factors, and its techniques aid in better decision-making.
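As a quick sanity check on headline figures like these, the implied compound annual growth rate can be recomputed from the start and end values: CAGR = (end/start)^(1/years) - 1. The report does not state the exact horizon behind its 26% figure, so the year counts below are assumptions:

```python
def implied_cagr(start_value: float, end_value: float, years: int) -> float:
    """CAGR implied by growing start_value to end_value over `years` years."""
    return (end_value / start_value) ** (1 / years) - 1

# USD 3.1B growing to USD 8.3B over an assumed 4- or 5-year horizon:
for years in (4, 5):
    print(years, f"{implied_cagr(3.1, 8.3, years):.1%}")
# 4 -> 27.9%, 5 -> 21.8%, bracketing the reported ~26% CAGR
```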

The coronavirus epidemic presented the healthcare industry with numerous problems and difficulties. Even though demand for many types of medical services increased during the various waves of the pandemic, other markets and business units faced numerous hurdles. Businesses that rely on traditional methods have had a difficult time due to operational restrictions and lockdowns imposed by various governments. To counter the pandemic's negative effects, it is critical to turn to digitization and investigate what the learning analytics industry has to offer in terms of analytical thinking.

Click Here to Get trial Premium Report @ https://www.marketresearchfuture.com/sample_request/5634

Market Segmentation

Based on the applications, the market is segmented into People Acquisition and Retention, Curriculum Development and Intervention Management, Performance Management, Budget, and Finance Management, Operations Management, and Others.

Based on the components, the market is segmented into Software, Services (Managed Services, Professional Services, Consulting, and Support and Maintenance).

Based on the end-users, the market is segmented into Academic, K-12, Higher Education, Enterprise/corporate, Small and Medium-sized Enterprises (SMEs), and Large Enterprises.

Based on the deployment model, the market is segmented into On-premises and Cloud.

Based on the region, the market is segmented into North America, Europe, Asia-Pacific, and the Rest of the World (RoW).

Regional Classification

By the end of the global forecast period, the North American region is predicted to have the largest growth in the learning analytics market, with the emergence of elements that will aid in gaining market supremacy. IT, sales and marketing, supply chain mechanisms, finance and HR, shop floor, and product management are all part of the learning analytics sector. The fundamental reason for this is the rapid growth of technology, which has been accompanied by widespread and early adoption of these solutions by the target audience in this region.

Browse Full Report Details @ https://www.marketresearchfuture.com/reports/learning-analytics-market-5634

Industry News

The major key players in the market are IBM (US), TIBCO (US), D2L (Canada), Microsoft (US), Ellucian (US), Hobsons (US), Oracle (US), Civitas Learning (US), Zogo Technologies LLC (US), SAP (Germany), InetSoft (US), SAS Institute (US), Tableau Software (US), Certica Solutions (US), and MicroStrategy (US). Understanding market opportunities in learning analytics helps anticipate how key players will seize them and turn them into something profitable for the worldwide market. These prospects also play an important role in analyzing market competition and mapping the competitive landscape through the end of the forecast period.

Frequently Asked Questions:

  1. What CAGR can the learning analytics market expect in the future?
  2. Who are the contenders operating in the learning analytics market?
  3. What is the growth outlook for the learning analytics market?
  4. What are the strategic product offerings in the learning analytics market?

About Market Research Future:

At Market Research Future (MRFR), we enable our customers to unravel the complexity of various industries through our Cooked Research Report (CRR), Half-Cooked Research Reports (HCRR), Raw Research Reports (3R), Continuous-Feed Research (CFR), and Market Research & Consulting Services.

The MRFR team's supreme objective is to provide optimum-quality market research and intelligence services to our clients. Our market research studies, organized by products, services, technologies, applications, end users, and market players for global, regional, and country-level market segments, enable our clients to see more, know more, and do more, which helps to answer all their most important questions.

Contact:

Market Research Future (Part of Wantstats Research and Media Private Limited)

99 Hudson Street, 5Th Floor

New York, NY 10013

United States of America

+1 628 258 0071 (US)

+44 2035 002 764 (UK)

Email: sales@marketresearchfuture.com

Website: https://www.marketresearchfuture.com

How Congress, the USPTO and firms can fix 101 woes: in-house

Counsel at IBM, Novartis, BMS and three other companies say legislation is their best option now SCOTUS has declined to hear American Axle


