Memorize the latest 000-595 topics before attempting the real exam

killexams.com provides legitimate, valid, and up-to-date 000-595 practice tests with actual exam questions and answers for new topics of the IBM 000-595 exam. Practice our real 000-595 questions and answers to improve your knowledge and pass your 000-595 exam on the first attempt. We are committed to ensuring your success when you face the 000-595 exam in the actual test.

Exam Code: 000-595 Practice test 2022 by Killexams.com team
IBM Maximo Asset Management V7.5 Fundamentals
IBM: Recent Earnings Are Not A Sign Of Fundamental Change

In the past few months, International Business Machines (NYSE:IBM) has turned into one of the best-performing tech names. Since I first covered the company in January of 2021, IBM has returned 17%, compared to merely 8% for the broader equity market.

[Chart: IBM total return. Data by YCharts]

During this timeframe, the spin-off of Kyndryl (KD) was completed, and now that the underperforming assets have been unloaded, expectations around the 'New IBM' are running high. Unfortunately, however, the strong share price performance since November of last year has little to do with IBM's fundamentals.

As we see in the graph below, the iShares Edge MSCI USA Momentum Factor ETF (MTUM) also peaked in November of last year, and since then the gap with the iShares Edge MSCI USA Value Factor ETF (VLUE) has been widening.

[Chart: MTUM vs. VLUE price. Data by YCharts]

As expectations of monetary tightening began to surface and inflationary pressures intensified, high-duration and momentum stocks began to underperform the lower-duration value companies. I talked about this dynamic in my recent analysis called 'The Cloud Space In Numbers: What Matters The Most', where I showed why the high-growth names were at risk. More specifically, I distinguished between the companies in the bottom left-hand corner and those in the upper right-hand corner in the graph below.

[Chart: Revenue Growth (fwd) vs. P/E Non-GAAP (fwd), November 2021. Prepared by the author, using data from Seeking Alpha]

As we see in the graph below, the high flyers, such as Workday (WDAY), Salesforce (CRM) and Adobe (ADBE), have become the worst performers, while companies like IBM and Oracle (ORCL) that were usually associated with low expected growth and low valuation multiples became the new stars.

[Chart: IBM vs. peers price. Data by YCharts]

Although this was good news for value investors as a whole and is a trend that could easily continue, we should distinguish between strong business performance and market-wide forces. That said, IBM shareholders should not simply assume that the strong share price performance is a sign of strong execution. Needless to say, Kyndryl's disastrous performance, losing 75% of its value in a matter of months, also lies on the shoulders of IBM's current management.

[Chart: Kyndryl market cap. Data by YCharts]

A Closer Look At IBM's Recent Earnings

IBM's recently reported quarterly numbers once again disappointed, and management seems to have attributed the slightly lower guidance largely to movements in the U.S. dollar.

[Table: IBM 2022 outlook. IBM Earnings Release]

Alongside the guidance, gross margins also fell across the board, with the exception of the Financing division, which is relatively small compared to the other business units.

[Table: IBM Q2 2022 margins. IBM Q2 2022 Earnings Release]

Rising labour and component costs were also to blame during the quarter, and management is addressing these through pricing actions, which will take some time to bear fruit.

Although this is likely true, IBM is also reducing spend on research and development and on selling, general and administrative functions. Such actions are usually taken as a precaution during downturns; however, consistently lower spend in those areas can have grave consequences.

[Chart: IBM fixed costs. Prepared by the author, using data from annual and quarterly reports]

Last but not least, the reported EPS numbers from continuing operations should also be adjusted, as I have outlined before.

[Table: IBM Q2 2022 EPS. IBM Q2 2022 Earnings Release]

I usually exclude the royalty income and all income/expenses grouped in the 'other' category. These items usually have little to do with IBM's ongoing business, and as such I deem them irrelevant for long-term shareholders.

[Tables: IBM intellectual property income and other income. IBM Annual Report 2021]

On an adjusted basis, EPS increased from $1.08 in Q2 2021 to $1.33 in Q2 2022, which, although a notable increase, remains low. As a back-of-the-envelope calculation, if we annualize the last quarterly result, we end up with a total EPS figure of $5.3, or a forward P/E ratio of almost 25x. Given all the difficulties facing IBM and its growth profile, this still appears too high.
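
That arithmetic can be reproduced in a few lines. Note that the share price below is my assumption (IBM traded near $130 in mid-2022), since the article does not state the price behind its multiple:

```python
# Back-of-the-envelope sketch of the forward P/E quoted above.
q2_2022_adjusted_eps = 1.33
annualized_eps = 4 * q2_2022_adjusted_eps        # naive: 4x the last quarter

assumed_share_price = 130.0                      # assumption, not from the article
forward_pe = assumed_share_price / annualized_eps

print(f"Annualized EPS: ${annualized_eps:.2f}")  # $5.32, i.e., roughly $5.3
print(f"Forward P/E:    {forward_pe:.1f}x")      # ~24.4x, i.e., almost 25x
```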

Has Anything Changed Following The Recent Earnings?

As expected, IBM continued its strategy of fueling growth through a frenzy of acquisitions and divestitures. Following the Kyndryl spin-off, the company completed four deals in a matter of just a few months.

[Headlines: IBM to acquire Envizi, Sentaca, Randori, and Databand.ai. Seeking Alpha]

As I have said before, all that does not bode well for the prospects of IBM's legacy businesses. Moreover, management does not seem to be focused on organic growth numbers in its quarterly reviews, which is even more worrisome.

Even now that the underperforming assets have been off-loaded, IBM's dividend payments remain too high relative to its adjusted income.

[Chart: IBM net income* versus dividend. Prepared by the author, using data from annual and quarterly reports. * Adjusted for intellectual property and custom development income, other (income) and expense, and income/(loss) from discontinued operations, net of tax]

As previously noted, this puts the company between a rock and a hard place. However, reducing or discontinuing the dividend could potentially result in an exodus of long-term shareholders.

We should also mention that IBM has barely paid any taxes over recent years due to various tax credits (see below). This, however, is gradually changing and will likely provide yet another headwind for EPS numbers in the future.

[Chart: IBM tax expense. Prepared by the author, using data from annual and quarterly reports]

Even though the narrative around IBM has been largely focused on its business turning around, the company's free cash flow per share continues to decline.

[Chart: IBM free cash flow. Prepared by the author, using data from annual and quarterly reports]

Conclusion

The potential upside from a successful turnaround story gravitating around the hybrid cloud is a major reason for many current and potential shareholders of IBM to hope for a light at the end of the tunnel. However, little seems to have changed at IBM following the spin-off of Kyndryl, and a declining business also creates a significant moral hazard problem for management, whereby more risk-taking is incentivized. All that, combined with the fact that IBM is doing M&A deals almost on a monthly basis, creates significant risks for long-term owners of the business.

Source: https://seekingalpha.com/article/4525527-ibm-recent-earnings-not-sign-fundamental-change (Mon, 25 Jul 2022)
Your cloud should adapt to you, not the other way around

Know your apps, know your workloads, know your team. In the second installment of this occasional series on cloud migration, Don Boulia, GM of IBM Cloud Developer Services, and guest Lauren Nelson, principal analyst at Forrester Research, discuss the importance of designing a cloud to fit your needs, instead of redesigning your business for cloud.

When adopting cloud, what should you look for to make sure it works for you?
Nelson: I’d say it’s more important to first build a culture that fosters learning, innovation, and change — so the very way your teams think and work is a tool to be leveraged throughout the transformation. Choose a cloud platform that adapts to your unique business needs, large or small. As your team learns the fundamentals, your cloud platform should evolve with your team’s working methods, pairing services to pain points and appropriate security levels. The education will be continuous.

Boulia: That’s a good way to say it. As a cloud provider, we view this from a couple of dimensions: which different classes of workloads will be run on cloud, and the tools necessary to serve these increasingly complex workloads. Teams will need to manage the unique characteristics of datasets that can be ingested into AI, IoT and blockchain, and can explore how these technologies can solve problems in new ways. When choosing a cloud provider, teams should look at platforms that offer access to these services, and can securely connect all their data into them to unlock the unique, and possibly unknown, value of information stored in existing systems or streaming into the cloud.

Another element to keep in mind is whether your platform will let you stay in control of your data, which is at the root of many security issues. Many teams are most concerned with security, but the true worry is how data will be impacted and accessed. For example, you may have a security-rich cloud, but can you manage encryption keys and other assets that let you decide where data resides and who can see it? These are capabilities enterprises need to move workloads to the cloud.

How do you design the security of your cloud to meet the needs of your data?
Nelson: Realistically, a strong cloud strategy should optimize existing apps along with cloud-native applications. With that in mind, it is imperative to get the right balance of user experience and governance. This may mean governance policies that alert your DevOps team when they’re out of compliance, or building an architecture that removes them from the intricacies of storing and securing data. For many it can be a combination, but considering recent regulations like GDPR, staying in control of your data is crucial.

Boulia: Data location, compute, and isolation all direct the strategy, especially when dealing with performance needs. But you also want the flexibility of accessing cloud-native technologies like AI, blockchain, and IoT that can drive innovation and revenue from this data. Therefore, a hybrid cloud strategy is often a smart way to approach your migration to the cloud, as your needs could range from the security that dedicated servers and bare metal provide to building and running cloud-native applications. Security should be managed at every level of whatever your stack looks like, ensuring data is encrypted whether it's on-premises, in the public cloud, or on an isolated server. It's therefore also important to ensure development teams have access to the right training to understand how to weave these sorts of security controls into apps.
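
To make the key-control point concrete, here is a minimal, generic sketch (not IBM's actual API) of client-side encryption using the open-source Python cryptography package: the customer generates and keeps the key, so the cloud provider only ever stores ciphertext, and the key holder decides who can read the data.

```python
# Illustrative "keep your own keys" sketch; the upload step is hypothetical,
# and this is not any specific cloud provider's API.
from cryptography.fernet import Fernet  # pip install cryptography

# The customer generates and retains the key; it never leaves their control.
customer_key = Fernet.generate_key()
cipher = Fernet(customer_key)

record = b"customer data bound for cloud object storage"
ciphertext = cipher.encrypt(record)   # all the provider would ever store

# upload_to_cloud(ciphertext) would go here; without the key, the
# provider (or any other party) cannot read the data.

# Only a holder of customer_key can recover the plaintext.
assert cipher.decrypt(ciphertext) == record
```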

How can I use cloud to explore and build with emerging technologies?
Nelson: What’s next depends on the organization. IoT, machine learning, AI, and blockchain are “now” for many organizations, and they’re being accessed via the cloud.

Boulia: The key is recognizing that cloud is a springboard for innovation. First, you should take inventory of where your data lives, and then determine how the right cloud strategy can help you bring that data to life in ways that will make the most impact. There are limitless opportunities to explore the potential of untapped data, and companies are already doing this today.

KONE, a global leader in elevators and escalators, is an example of how the cloud can be put to work, as it helps move one billion people around buildings daily. Using Watson IoT services on IBM Cloud, KONE connects and remotely monitors its elevators and escalators and optimizes the data coming in from them. By analyzing vast amounts of data from embedded sensors in that equipment, KONE is able to identify and predict issues, as well as quickly respond to help keep people flowing across urban environments.

Stories like this are only scratching the surface of what cloud technologies can do, if companies choose a cloud that meets them where they are in their journey – instead of immediately upending all processes.

Source: https://sdtimes.com/cloud/your-cloud-should-adapt-to-you-not-the-other-way-around/ (Sun, 10 Jul 2022)
Why the Pandemic Downturn Has Been Amid the “Most Benign” for the Office Sector

When COVID-19 kept a big part of the U.S. office workforce home throughout much of 2020 and 2021, the fear in the commercial real estate industry was that the office sector would take a huge hit. Even now, some media headlines continue to predict the permanent obsolescence of the office as a concept. But a June report from Moody’s Analytics found that the pandemic has not had as great a negative impact on office sector fundamentals as the Great Financial Crisis and previous real estate downturns did.

The report found that the U.S. office sector registered more moderate occupancy and rent declines in the wake of the pandemic than it did during the previous three downturns; that office loan delinquency rates never spiraled out of control; and that office equity returns continued to be largely positive for institutional investors.

There are also indications that in spite of the challenges associated with remote work, commercial real estate investors remain bullish on the office sector, especially when it comes to class-A properties in strong locations.

To find out what’s going on in the sector, WMRE recently spoke with Kevin Fagan, head of CRE economic analysis with Moody’s Analytics and one of the authors of the June report. Fagan talks about a comparative analysis of office properties today vs. previous downturns and explains what might have softened the impact of the pandemic on the sector.

The following Q&A has been edited for length, style and clarity.

WMRE: How does the state of the office sector today compare to what happened during previous real estate downturns?

Kevin Fagan: The loss of rent, occupancy and value in office buildings and defaults on office loans have been significantly less in this downturn than in the typical boom-bust cycles over the past 40 years. We’ve been broadly tracking office performance metrics at Moody's Analytics CRE (formerly REIS), and two years after the 2020 recession a lasting, historically deep deterioration stemming from the pandemic has yet to materialize in traditional indicators.

WMRE: What trends in property fundamentals have been the same during the pandemic downturn as in previous ones, and what trends/themes have differed? How important are those differences?

Kevin Fagan: Office sector fundamentals in this down cycle did not follow past cycles, which came on the heels of loss of office-using employment and corporate defaults that traditionally soften demand for office space and, therefore, office values and investment. Office-using employment and corporate performance were resilient after the 2020 hit. While employment losses in 2020 as a percent of total U.S. jobs were the deepest since the Great Depression in 1929, the vast majority of those lost jobs were in the service and hospitality sectors. Those are generally lower-pay, lower-skill jobs than typical professional services, office-using jobs.

Therefore, in combination with government fiscal and monetary support, the economy and corporations rebounded quickly without reducing or abandoning their office spaces en masse, despite not actively utilizing the space during the triage work-from-home period of the pandemic. Of course, that wasn't true across the board, so pain in the business world did ultimately flow through to the office sector, just not nearly on the scale of past cycles that are financially driven and hit professional service firms hard.

WMRE: Can you explain why office revenues didn’t take as big of a hit during the pandemic as during previous market downturns?

Kevin Fagan: In the period between 2008 and 2020, we didn't see as much overbuilding in the office sector compared to the usual ramp-up before other downturns. This largely had to do with High Volatility Commercial Real Estate Rules (banking regulator rules for high-risk projects that require a more robust capital buffer than less risky projects to mitigate the bank's risk) that resulted in less construction lending in this cycle than prior up cycles.

WMRE: Why hasn't there been a huge impact from the pandemic on office equity returns?

Kevin Fagan: Unlike the stock market or other trading markets, commercial real estate capital markets move slowly. Price is determined by sales. Volume of office building sales was down over the past two years, and those that did trade were either already in process before the pandemic or simply weren't the forced, distressed sales that percolate during downturns.

Broadly speaking, building owners didn't have the revenue decline that needs to happen to force a sale, either by the ownership trying to cut losses or lenders selling through a foreclosure or lender REO. If more distressed sales were forced to happen, we likely would have seen a greater drop in office values. But, as it was, the implied values from the sales that happened and [their] appraisals remained largely intact. Indeed, many saw implied value increases. Repeat sale indices from many different commercial real estate analytics companies, including Moody's, recorded a slight dip at the national level, but a double-digit office price increase over 2021. 

WMRE: What was one of the biggest surprises your research revealed?

Kevin Fagan: For us, the research wasn't surprising because our business is to track these metrics closely. Those that don't track them as closely may find it surprising to learn that this has been one of the most benign cycles for offices, given the deluge of negative headlines about the apocalyptic future of office. Our intent was to make it clear that if that apocalypse is to occur that it hasn't happened yet, relative to the numbers in past cycles.

WMRE: How does the implementation of remote and hybrid work correlate with office vacancies?

Kevin Fagan: We have seen some early evidence that markets with the lowest physical office utilization rates—where companies have been slower to make a "return to the office"—have higher vacancy rates. It's not clear by any means yet that this trend will continue, but that could be because “the great return to office” really will not begin until fall 2022. Many companies will require "anchor days" each week when employees must be in the office, so we expect to see a resurgence in occupancy rates in the big, dense urban markets during 2023.

The biggest question now is how or if companies can shed significant space if employees are in the office less, but there will be a long experimental period, with different companies taking different [routes] that best suit them. Very few real estate pros or consultancy firms that are advising tenants on space usage believe that companies can translate their employees being in the office 40 to 50 percent less on average to 40 to 50 percent less office space occupied. Companies that need their employees to interact or collaborate will have to resolve how to keep interactions happening.

There have been excellent studies that show "incidental collision" between workers is very important to the overall productivity of a company, and those collisions are reduced dramatically every day or week fewer employees come together in the office. Companies, therefore, will need to weigh the ability to cultivate collaborative workforces against a desire to save on real estate costs.

WMRE: What are the biggest issues facing the office sector today that need to be resolved? How much time do you expect it will take to resolve them?

Kevin Fagan: The biggest issue by far is how companies come to rationalize their space needs against what makes their workforce the most efficient. Real estate costs would be nice to cut, but only account for around 4 percent of a company’s revenue—[and] much less for big tech firms. On the other hand, the costs for human capital are typically 5 to 10 times higher than the cost of office space and very important to get right.

Remote working actually has a 40-year history, starting with IBM in 1983. There have been a lot of real world experiments and academic studies on the pros and cons of diversified workforce practices for workers, employers and society in general.

That experiment will have to continue on a much larger scale coming off the pandemic, with companies trying out anchor days, a minimum or set number of days when employees are required to be in the office; allowing team managers to try different in-office strategies, like a mix of reservation-style "hot desking" and private offices; taking more co-working or flex office space; and all other manner of in-office approaches to get workers together to interact and collaborate.

It's unlikely that the first approach for a company will be the permanent state of things, so we're most likely looking at multiple years of experimentation to see what works best. This will enable tenants to rationalize their space needs longer term, but it will be a number of years before they can make a determination and their current leases expire.

WMRE: How long do you expect it to take for the market to gain clarity on the ultimate fate of the office sector? Do you have some scenarios for how it might go that you can share with WMRE readers?

Kevin Fagan: The direction that the office market goes will largely require us to wait until companies start making those long-term decisions after returning to the office. Since the delay from the Omicron variant and the lull of the summer, it's likely we'll see very early indicators starting in the fall of 2022. However, those early indicators could be spurious anomalies, and we'll have to be careful before making dramatic conclusions of apocalypse or no apocalypse.

We'll likely get more hard, reliable data over the course of the second half of 2023 through 2025. And, even if we do see some widespread reduction in demand for office space per employee, that doesn't necessarily portend a mass decline in office rent, occupancy and value.

Starting in the 1990s, there was a major shift in the office market that caused companies to eventually cut the amount of space per employee in half, from about 250 sq. ft. to about 125 sq. ft. per employee. Over that period, however, office rents and values continued to grow, especially for CBD offices.

The natural growth of office-using employment, assuming we don't have a glut of new office construction, could cause absorption of space even with less demand per firm and employee.

WMRE: Is there anything else you want to add based on your findings?

Kevin Fagan: Office management is very likely to change significantly going forward, with more of a hospitality kind of angle. Also, tenants will be more selective with office space. They’re likely to either want deeply discounted rents or high-amenity type spaces. Owners and operators that improve their services, focus on health and sustainability, common area spaces, a variety of food options and so forth are the most likely to survive and thrive in the post-pandemic office world.

Source: https://www.wealthmanagement.com/office/why-pandemic-downturn-has-been-amid-most-benign-office-sector (Mon, 11 Jul 2022)
Latest Memo From Howard Marks: I Beg To Differ

I've written many times about having joined the investment industry in 1969 when the "Nifty Fifty" stocks were in full flower. My first employer, First National City Bank, as well as many of the other "money-center banks" (the leading investment managers of the day), was enthralled with these companies, with their powerful business models and flawless prospects. Sentiment surrounding their stocks was uniformly positive, and portfolio managers found great safety in numbers. For example, a common refrain at the time was "you can't be fired for buying IBM," the era's quintessential growth company.

I've also written extensively about the fate of these stocks. In 1973-74, the OPEC oil embargo and the resultant recession took the S&P 500 Index down a total of 47%. And many of the Nifty Fifty, for which it had been thought that "no price was too high," did far worse, falling from peak p/e ratios of 60-90 to trough multiples in the single digits. Thus, their devotees lost almost all of their money in the stocks of companies that "everyone knew" were great. This was my first chance to see what can happen to assets that are on what I call "the pedestal of popularity."

In 1978, I was asked to move to the bank's bond department to start funds in convertible bonds and, shortly thereafter, high yield bonds. Now I was investing in securities most fiduciaries considered "uninvestable" and which practically no one knew about, cared about, or deemed desirable... and I was making money steadily and safely. I quickly recognized that my strong performance resulted in large part from precisely that fact: I was investing in securities that practically no one knew about, cared about, or deemed desirable. This brought home the key money-making lesson of the Efficient Market Hypothesis, which I had been introduced to at the University of Chicago Business School: If you seek superior investment results, you have to invest in things that others haven't flocked to and caused to be fully valued. In other words, you have to do something different.

The Essential Difference

In 2006, I wrote a memo called Dare to Be Great. It was mostly about having high aspirations, and it included a rant against conformity and investment bureaucracy, as well as an assertion that the route to superior returns by necessity runs through unconventionality. The element of that memo that people still talk to me about is a simple two-by-two matrix:

                      Conventional Behavior    Unconventional Behavior
Favorable Outcomes    Average good results     Above average results
Unfavorable Outcomes  Average bad results      Below average results

Here's how I explained the situation:

Of course, it's not easy and clear-cut, but I think it's the general situation. If your behavior and that of your managers are conventional, you're likely to get conventional results - either good or bad. Only if the behavior is unconventional is your performance likely to be unconventional... and only if the judgments are superior is your performance likely to be above average.

The consensus opinion of market participants is baked into market prices. Thus, if investors lack insight superior to the average of the people who make up the consensus, they should expect average risk-adjusted performance.

Many years have passed since I wrote that memo, and the investing world has gotten a lot more sophisticated, but the message conveyed by the matrix and the accompanying explanation remains unchanged. Talk about simple - in the memo, I reduced the issue to a single sentence: "This just in: You can't take the same actions as everyone else and expect to outperform."

The best way to understand this idea is by thinking through a highly logical and almost mathematical process (greatly simplified, as usual, for illustrative purposes):

  • A certain (but unascertainable) number of dollars will be made over any given period by all investors collectively in an individual stock, a given market, or all markets taken together. That amount will be a function of (a) how companies or assets fare in fundamental terms (e.g., how their profits grow or decline) and (b) how people feel about those fundamentals and treat asset prices.

  • On average, all investors will do average.

  • If you're happy doing average, you can simply invest in a broad swath of the assets in question, buying some of each in proportion to its representation in the relevant universe or index. By engaging in average behavior in this way, you're guaranteed average performance. (Obviously, this is the idea behind index funds.)

  • If you want to be above average, you have to depart from consensus behavior. You have to overweight some securities, asset classes, or markets and underweight others. In other words, you have to do something different.

  • The challenge lies in the fact that (a) market prices are the result of everyone's collective thinking and (b) it's hard for any individual to consistently figure out when the consensus is wrong and an asset is priced too high or too low.

  • Nevertheless, "active investors" place active bets in an effort to be above average.

    • Investor A decides stocks as a whole are too cheap, and he sells bonds in order to overweight stocks. Investor B thinks stocks are too expensive, so she moves to an underweighting by selling some of her stocks to Investor A and putting the proceeds into bonds.

    • Investor X decides a certain stock is too cheap and overweights it, buying from investor Y, who thinks it's too expensive and therefore wants to underweight it.

  • It's essential to note that in each of the above cases, one investor is right and the other is wrong. Now go back to the first bullet point above: Since the total dollars earned by all investors collectively are fixed in amount, all active bets, taken together, constitute a zero-sum game (or negative-sum after commissions and other costs); the short simulation after this list makes the arithmetic concrete. The investor who is right earns an above-average return, and by definition, the one who's wrong earns a below-average return.

  • Thus, every active bet placed in the pursuit of above-average returns carries with it the risk of below-average returns. There's no way to make an active bet such that you'll win if it works but not lose if it doesn't. Financial innovations are often described as offering some version of this impossible bargain, but they invariably fail to live up to the hype.

  • The bottom line of the above is simple: You can't hope to earn above-average returns if you don't place active bets, but if your active bets are wrong, your return will be below average.
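
To make the zero-sum arithmetic above concrete, here is a minimal simulation (my illustration, not from the memo, with assumed numbers): give every active investor the market return plus a tilt, force the tilts to net to zero because every overweight is someone else's underweight, and the average investor earns exactly the market return before costs, no matter how the individual bets land.

```python
# Minimal sketch: active tilts around the market return must net to zero
# across investors, so the average active investor earns the market return.
import random

random.seed(42)

MARKET_RETURN = 0.08    # assumed return earned by the market as a whole
N = 1_000               # active investors, equal capital for simplicity

# Each investor deviates from the market; deviations are de-meaned so they
# sum to zero, since every overweight is someone else's underweight.
tilts = [random.gauss(0.0, 0.05) for _ in range(N)]
mean_tilt = sum(tilts) / N
tilts = [t - mean_tilt for t in tilts]

returns = [MARKET_RETURN + t for t in tilts]

average_return = sum(returns) / N
outperformers = sum(r > MARKET_RETURN for r in returns)

print(f"average investor return: {average_return:.4%}")      # equals 8.0000%
print(f"beat the market: {outperformers} of {N} investors")  # roughly half
```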

Investing strikes me as being very much like golf, where playing conditions and the performance of competitors can change from day to day, as can the placement of the holes. On some days, one approach to the course is appropriate, but on other days, different tactics are called for. To win, you have to either do a better job than others of selecting your approach or executing on it or both.

The same is true for investors. It's simple: If you hope to distinguish yourself in terms of performance, you have to depart from the pack. But, having departed, the difference will only be positive if your choice of strategies and tactics is correct and/or you're able to execute better.

Second-Level Thinking

In 2009, when Columbia Business School Publishing was considering whether to publish my book The Most Important Thing, they asked to see a sample chapter. As has often been my experience, I sat down and described a concept I hadn't previously written about or named. That description became the book's first chapter, addressing one of its most important topics: second-level thinking. It's certainly the concept from the book that people ask me about most often.

The idea of second-level thinking builds on what I wrote in Dare to Be Great. First, I repeated my view that success in investing means doing better than others. All active investors (and certainly money managers hoping to earn a living) are driven by the pursuit of superior returns.

But that universality also makes beating the market a difficult task. Millions of people are competing for each dollar of investment gain. Who'll get it? The person who's a step ahead. In some pursuits, getting up to the front of the pack means more schooling, more time in the gym or the library, better nutrition, more perspiration, greater stamina, or better equipment. But in investing, where these things count for less, it calls for more perceptive thinking... at what I call the second level.

The basic idea behind second-level thinking is easily summarized: In order to outperform, your thinking has to be different and better.

Remember, your goal in investing isn't to earn average returns; you want to do better than average. Thus, your thinking has to be better than that of others - both more powerful and at a higher level. Since other investors may be smart, well-informed, and highly computerized, you must find an edge they don't have. You must think of something they haven't thought of, see things they miss, or bring insight they don't possess. You have to react differently and behave differently. In short, being right may be a necessary condition for investment success, but it won't be sufficient. You have to be more right than others... which by definition means your thinking has to be different.

Having made the case, I went on to distinguish second-level thinkers from those who operate at the first level:

First-level thinking is simplistic and superficial, and just about everyone can do it (a bad sign for anything involving an attempt at superiority). All the first-level thinker needs is an opinion about the future, as in "The outlook for the company is favorable, meaning the stock will go up."

Second-level thinking is deep, complex, and convoluted. The second-level thinker takes a great many things into account:

  • What is the range of likely future outcomes?

  • What outcome do I think will occur?

  • What's the probability I'm right?

  • What does the consensus think?

  • How does my expectation differ from the consensus?

  • How does the current price for the asset comport with the consensus view of the future, and with mine?

  • Is the consensus psychology that's incorporated in the price too bullish or bearish?

  • What will happen to the asset's price if the consensus turns out to be right, and what if I'm right?

The difference in workload between first-level and second-level thinking is clearly massive, and the number of people capable of the latter is tiny compared to the number capable of the former.

First-level thinkers look for simple formulas and easy answers. Second-level thinkers know that success in investing is the antithesis of simple.

Speaking about difficulty reminds me of an important idea that arose in my discussions with my son Andrew during the pandemic (described in the memo Something of Value, published in January 2021). In the memo's extensive discussion of how efficient most markets have become in recent decades, Andrew makes a terrific point: "Readily available quantitative information with regard to the present cannot be the source of superior performance." After all, everyone has access to this type of information - with regard to public U.S. securities, that's the whole point of the SEC's Reg FD (for fair disclosure) - and nowadays all investors should know how to manipulate data and run screens.

So, then, how can investors who are intent on outperforming hope to reach their goal? As Andrew and I said on a podcast where we discussed Something of Value, they have to go beyond readily available quantitative information with regard to the present. Instead, their superiority has to come from an ability to:

  • better understand the significance of the published numbers,

  • better assess the qualitative aspects of the company, and/or

  • better divine the future.

Obviously, none of these things can be determined with certainty, measured empirically, or processed using surefire formulas. Unlike present-day quantitative information, there's no source you can turn to for easy answers. They all come down to judgment or insight. Second-level thinkers who have better judgment are likely to achieve superior returns, and those who are less insightful are likely to generate inferior performance.

This all leads me back to something Charlie Munger told me around the time The Most Important Thing was published: "It's not supposed to be easy. Anyone who finds it easy is stupid." Anyone who thinks there's a formula for investing that guarantees success (and that they can possess it) clearly doesn't understand the complex, dynamic, and competitive nature of the investing process. The prize for superior investing can amount to a lot of money. In the highly competitive investment arena, it simply can't be easy to be the one who pockets the extra dollars.

Contrarianism

There's a concept in the investing world that's closely related to being different: contrarianism. "The investment herd" refers to the masses of people (or institutions) that drive security prices one way or the other. It's their actions that take asset prices to bull market highs and sometimes bubbles and, in the other direction, to bear market territory and occasional crashes. At these extremes, which are invariably overdone, it's essential to act in a contrary fashion.

Joining in the swings described above causes people to own or buy assets at high prices and to sell or fail to buy at low prices. For this reason, it can be important to part company with the herd and behave in a way that's contrary to the actions of most others.

Contrarianism received its own chapter in The Most Important Thing. Here's how I set forth the logic:

  • Markets swing dramatically, from bullish to bearish, and from overpriced to underpriced.

  • Their movements are driven by the actions of "the crowd," "the herd," and "most people." Bull markets occur because more people want to buy than sell, or the buyers are more highly motivated than the sellers. The market rises as people switch from being sellers to being buyers, and as buyers become even more motivated and the sellers less so. (If buyers didn't predominate, the market wouldn't be rising.)

  • Market extremes represent inflection points. These occur when bullishness or bearishness reaches a maximum. Figuratively speaking, a top occurs when the last person who will become a buyer does so. Since every buyer has joined the bullish herd by the time the top is reached, bullishness can go no further, and the market is as high as it can go. Buying or holding is dangerous.

  • Since there's no one left to turn bullish, the market stops going up. And if the next day one person switches from buyer to seller, it will start to go down.

  • So at the extremes, which are created by what "most people" believe, most people are wrong.

  • Therefore, the key to investment success has to lie in doing the opposite: in diverging from the crowd. Those who recognize the errors that others make can profit enormously from contrarianism.

To sum up, if the extreme highs and lows are excessive and the result of the concerted, mistaken actions of most investors, then it's essential to leave the crowd and be a contrarian.

In his 2000 book, Pioneering Portfolio Management, David Swensen, the former chief investment officer of Yale University, explained why investing institutions are vulnerable to conformity with current consensus belief and why they should instead embrace contrarianism. (For more on Swensen's approach to investing, see "A Case in Point" below.) He also stressed the importance of building infrastructure that enables contrarianism to be employed successfully:

Unless institutions maintain contrarian positions through difficult times, the resulting damage imposes severe financial and reputational costs on the institution.

Casually researched, consensus-oriented investment positions provide little prospect for producing superior results in the intensely competitive investment management world.

Unfortunately, overcoming the tendency to follow the crowd, while necessary, proves insufficient to guarantee investment success... While courage to take a different path enhances chances for success, investors face likely failure unless a thoughtful set of investment principles undergirds the courage.

Before I leave the subject of contrarianism, I want to make something else very clear. First-level thinkers - to the extent they're interested in the concept of contrarianism - might believe contrarianism means doing the opposite of what most people are doing, so selling when the market rises and buying when it falls. But this overly simplistic definition of contrarianism is unlikely to be of much help to investors. Instead, the understanding of contrarianism itself has to take place at a second level.

In The Most Important Thing Illuminated, an annotated edition of my book, four professional investors and academics provided commentary on what I had written. My good friend Joel Greenblatt, an exceptional equity investor, provided a very apt observation regarding knee-jerk contrarianism: "... just because no one else will jump in front of a Mack truck barreling down the highway doesn't mean that you should." In other words, the mass of investors aren't wrong all the time, or wrong so dependably that it's always right to do the opposite of what they do. Rather, to be an effective contrarian, you have to figure out:

  • what the herd is doing;

  • why it's doing it;

  • what's wrong, if anything, with what it's doing; and

  • what you should do about it.

Like the second-level thought process laid out in bullet points on page four, intelligent contrarianism is deep and complex. It amounts to much more than simply doing the opposite of the crowd. Nevertheless, good investment decisions made at the best opportunities - at the most overdone market extremes - invariably include an element of contrarian thinking.

The Decision to Risk Being Wrong

There are only so many topics I find worth writing about, and since I know I'll never know all there is to know about them, I return to some from time to time and add to what I've written previously. Thus, in 2014, I followed up on 2006's Dare to Be Great with a memo creatively titled Dare to Be Great II. To begin, I repeated my insistence on the importance of being different:

If your portfolio looks like everyone else's, you may do well, or you may do poorly, but you can't do differently. And being different is absolutely essential if you want a chance at being superior...

I followed that with a discussion of the challenges associated with being different:

Most great investments begin in discomfort. The things most people feel good about - investments where the underlying premise is widely accepted, the recent performance has been positive, and the outlook is rosy - are unlikely to be available at bargain prices. Rather, bargains are usually found among things that are controversial, that people are pessimistic about, and that have been performing badly of late.

But then, perhaps most importantly, I took the idea a step further, moving from daring to be different to its natural corollary: daring to be wrong. Most investment books are about how to be right, not the possibility of being wrong. And yet, the would-be active investor must understand that every attempt at success by necessity carries with it the chance for failure. The two are absolutely inseparable, as I described at the top of page three.

In a market that is even moderately efficient, everything you do to depart from the consensus in pursuit of above-average returns has the potential to result in below-average returns if your departure turns out to be a mistake. Overweighting something versus underweighting it; concentrating versus diversifying; holding versus selling; hedging versus not hedging - these are all double-edged swords. You gain when you make the right choice and lose when you're wrong.

One of my favorite sayings came from a pit boss at a Las Vegas casino: "The more you bet, the more you win when you win." Absolutely inarguable. But the pit boss conveniently omitted the converse: "The more you bet, the more you lose when you lose." Clearly, those two ideas go together.

In a presentation I occasionally make to institutional clients, I employ PowerPoint animation to graphically portray the essence of this situation:

  • A bubble drops down, containing the words "Try to be right." That's what active investing is all about. But then a few more words show up in the bubble: "Run the risk of being wrong." The bottom line is that you simply can't do the former without also doing the latter. They're inextricably intertwined.

  • Then another bubble drops down, with the label "Can't lose." There are can't-lose strategies in investing. If you buy T-bills, you can't have a negative return. If you invest in an index fund, you can't underperform the index. But then two more words appear in the second bubble: "Can't win." People who use can't-lose strategies by necessity surrender the possibility of winning. T-bill investors can't earn more than the lowest of yields. Index fund investors can't outperform.

  • And that brings me to the assignment I imagine receiving from unenlightened clients: "Just apply the first set of words from each bubble: Try to outperform while employing can't-lose strategies." But that combination happens to be unavailable.

The above shows that active investing carries a cost that goes beyond commissions and management fees: heightened risk of inferior performance. Thus, every investor has to make a conscious decision about which course to follow. Pursue superior returns at the risk of coming in behind the pack, or hug the consensus position and ensure average performance. It should be clear that you can't hope to earn superior returns if you're unwilling to bear the risk of sub-par results.

And that brings me to my favorite fortune cookie, which I received with dessert 40-50 years ago. The message inside was simple: The cautious seldom err or write great poetry. In my college classes in Japanese studies, I learned about the koan, which Oxford Languages defines as "a paradoxical anecdote or riddle, used in Zen Buddhism to demonstrate the inadequacy of logical reasoning and to provoke enlightenment." I think of my fortune that way because it raises a question I find paradoxical and capable of leading to enlightenment.

But what does the fortune mean? That you should be cautious because cautious people seldom make mistakes? Or that you shouldn't be cautious, because cautious people rarely accomplish great things?

The fortune can be read both ways, and both conclusions seem reasonable. Thus the key question is, "Which meaning is right for you?" As an investor, do you like the idea of avoiding error, or would you rather try for superiority? Which path is more likely to lead to success as you define it, and which is more feasible for you? You can follow either path, but clearly not both simultaneously.

Thus, investors have to answer what should be a very basic question: Will you (a) strive to be above average, which costs money, is far from sure to work, and can result in your being below average, or (b) accept average performance - which helps you reduce those costs but also means you'll have to look on with envy as winners report mouth-watering successes. Here's how I put it in Dare to Be Great II:

How much emphasis should be put on diversifying, avoiding risk, and ensuring against below-pack performance, and how much on sacrificing these things in the hope of doing better?

And here's how I described some of the considerations:

Unconventional behavior is the only road to superior investment results, but it isn't for everyone. In addition to superior skill, successful investing requires the ability to look wrong for a while and survive some mistakes. Thus each person has to assess whether he's temperamentally equipped to do these things and whether his circumstances - in terms of employers, clients and the impact of other people's opinions - will allow it... when the chips are down and the early going makes him look wrong, as it invariably will.

You can't have it both ways. And as in so many aspects of investing, there's no right or wrong, only right or wrong for you.

A Case in Point

The aforementioned David Swensen ran Yale University's endowment from 1985 until his passing in 2021, an unusual 36-year tenure. He was a true pioneer, developing what has come to be called "the Yale Model" or "the Endowment Model." He radically reduced Yale's holdings of public stocks and bonds and invested heavily in innovative, illiquid strategies such as hedge funds, venture capital, and private equity at a time when almost no other institutions were doing so. He identified managers in those fields who went on to generate superior results, several of whom earned investment fame. Yale's resulting performance beat almost all other endowments by miles. In addition, Swensen sent out into the endowment community a number of disciples who produced enviable performances for other institutions. Many endowments emulated Yale's approach, especially beginning around 2003-04 after these institutions had been punished by the bursting of the tech/Internet bubble. But few if any duplicated Yale's success. They did the same things, but not nearly as early or as well.

To sum up all the above, I'd say Swensen dared to be different. He did things others didn't do. He did these things long before most others picked up the thread. He did them to a degree that others didn't approach. And he did them with exceptional skill. What a great formula for outperformance.

In Pioneering Portfolio Management, Swensen provided a description of the challenge at the core of investing - especially institutional investing. It's one of the best paragraphs I've ever read and includes a two-word phrase (which I've bolded for emphasis) that for me reads like sheer investment poetry. I've borrowed it countless times:

...Active management strategies demand uninstitutional behavior from institutions, creating a paradox that few can unravel. Establishing and maintaining an unconventional investment profile requires acceptance of uncomfortably idiosyncratic portfolios, which frequently appear downright imprudent in the eyes of conventional wisdom.

As with many great quotes, this one from Swensen says a great deal in just a few words. Let's parse its meaning:

Idiosyncratic - When all investors love something, it's likely their buying will render it highly-priced. When they hate it, their selling will probably cause it to become cheap. Thus, it's preferable to buy things most people hate and sell things most people love. Such behavior is by definition highly idiosyncratic (i.e., "eccentric," "quirky," or "peculiar").

Uncomfortable - The mass of investors take the positions they take for reasons they find convincing. We witness the same developments they do and are impacted by the same news. Yet, we realize that if we want to be above average, our reaction to those inputs - and thus our behavior - should in many instances be different from that of others. Regardless of the reasons, if millions of investors are doing A, it may be quite uncomfortable to do B.

And if we do bring ourselves to do B, our action is unlikely to prove correct right away. After we've sold a market darling because we think it's overvalued, its price probably won't start to drop the next day. Most of the time, the hot asset you've sold will keep rising for a while, and sometimes a good while. As John Maynard Keynes said, "Markets can remain irrational longer than you can remain solvent." And as the old adage goes, "Being too far ahead of your time is indistinguishable from being wrong." These two ideas are closely related to another great Keynes quote: "Worldly wisdom teaches that it is better for the reputation to fail conventionally than to succeed unconventionally." Departing from the mainstream can be embarrassing and painful.

Uninstitutional behavior from institutions - We all know what Swensen meant by the word "institutions": bureaucratic, hidebound, conservative, conventional, risk-averse, and ruled by consensus; in short, unlikely mavericks. In such settings, the cost of being different and wrong can be viewed as highly unacceptable relative to the potential benefit of being different and right. For the people involved, passing up profitable investments (errors of omission) poses far less risk than making investments that produce losses (errors of commission). Thus, investing entities that behave "institutionally" are, by their nature, highly unlikely to engage in idiosyncratic behavior.

Early in his time at Yale, Swensen chose to:

  • minimize holdings of public stocks;

  • vastly overweight strategies falling under the heading "alternative investments" (although he started to do so well before that label was created);

  • in so doing, commit a substantial portion of Yale's endowment to illiquid investments for which there was no market; and

  • hire managers without lengthy track records on the basis of what he perceived to be their investment acumen.

To use his words, these actions probably appeared "downright imprudent in the eyes of conventional wisdom." Swensen's behavior was certainly idiosyncratic and uninstitutional, but he understood that the only way to outperform was to risk being wrong, and he accepted that risk with great results.

One Way to Diverge from the Pack

To conclude, I want to describe a recent occurrence. In mid-June, we held the London edition of Oaktree's biannual conference, which followed on the heels of the Los Angeles version. My assigned topic at both conferences was the market environment. I faced a dilemma while preparing for the London conference because so much had changed between the two events: On May 19, the S&P 500 was at roughly 3,900, but by June 21 it was at approximately 3,750, down almost 4% in roughly a month. Here was my issue: Should I update my slides, which had become somewhat dated, or reuse the LA slides to deliver a consistent message to both audiences?

I decided to use the LA slides as the jumping-off point for a discussion of how much things had changed in that short period. The key segment of my London presentation consisted of a stream-of-consciousness discussion of the concerns of the day. I told the attendees that I pay close attention to the questions people ask most often at any given point in time, as the questions tell me what's on people's minds. And the questions I'm asked these days overwhelmingly surround:

  • the outlook for inflation,

  • the extent to which the Federal Reserve will raise interest rates to bring it under control, and

  • whether doing so will produce a soft landing or a recession (and if the latter, how bad).

Afterward, I wasn't completely happy with my remarks, so I rethought them over lunch. And when it was time to resume the program, I went up on stage for another two minutes. Here's what I said:

All the discussion surrounding inflation, rates, and recession falls under the same heading: the short term. And yet:

  • We can't know much about the short-term future (or, I should say, we can't dependably know more than the consensus).

  • If we have an opinion about the short term, we can't (or shouldn't) have much confidence in it.

  • If we reach a conclusion, there's not much we can do about it - most investors can't and won't meaningfully revamp their portfolios based on such opinions.

  • We really shouldn't care about the short term - after all, we're investors, not traders.

I think it's the last point that matters most. The question is whether you agree or not.

For example, when asked whether we're heading toward a recession, my usual answer is that whenever we're not in a recession, we're heading toward one. The question is when. I believe we'll always have cycles, which means recessions and recoveries will always lie ahead. Does the fact that there's a recession ahead mean we should reduce our investments or alter our portfolio allocation? I don't think so. Since 1920, there have been 17 recessions as well as one Great Depression, a World War and several smaller wars, multiple periods of worry about global cataclysm, and now a pandemic. And yet, as I mentioned in my January memo, Selling Out, the S&P 500 has returned about 10½% a year on average over that century-plus. Would investors have improved their performance by getting in and out of the market to avoid those problem spots... or would doing so have diminished it? Ever since I quoted Bill Miller in that memo, I've been impressed by his formulation that "it's time, not timing" that leads to real wealth accumulation. Thus, most investors would be better off ignoring short-term considerations if they want to enjoy the benefits of long-term compounding.

Two of the six tenets of Oaktree's investment philosophy say (a) we don't base our investment decisions on macro forecasts and (b) we're not market timers. I told the London audience our main goal is to buy debt or make loans that will be repaid and to buy interests in companies that will do well and make money. None of that has anything to do with the short term.

From time to time, when we consider it warranted, we do vary our balance between aggressiveness and defensiveness, primarily by altering the size of our closed-end funds, the pace at which we invest, and the level of risk we'll accept. But we do these things on the basis of current market conditions, not expectations regarding future events.

Everyone at Oaktree has opinions on the short-run phenomena mentioned above. We just don't bet heavily that they're right. During our recent meetings with clients in London, Bruce Karsh and I spent a lot of time discussing the significance of the short-term concerns. Here's how he followed up in a note to me:

...Will things be as bad or worse or better than expected? Unknowable... and equally unknowable how much is priced in, i.e. what the market is truly expecting. One would think a recession is priced in, but many analysts say that's not the case. This stuff is hard...!!!

Bruce's comment highlights another weakness of having a short-term focus. Even if we think we know what's in store in terms of things like inflation, recessions, and interest rates, there's absolutely no way to know how market prices comport with those expectations. This is more significant than most people realize. If you've developed opinions regarding the issues of the day, or have access to those of pundits you respect, take a look at any asset and ask yourself whether it's priced rich, cheap, or fair in light of those views. That's what matters when you're pursuing investments that are reasonably priced.

The possibility - or even the fact - that a negative event lies ahead isn't in itself a reason to reduce risk; investors should only do so if the event lies ahead and it isn't appropriately reflected in asset prices. But, as Bruce says, there's usually no way to know.

At the beginning of my career, we thought in terms of investing in a stock for five or six years; something held for less than a year was considered a short-term trade. One of the biggest changes I've witnessed since then is the incredible shortening of time horizons. Money managers know their returns in real-time, and many clients are fixated on how their managers did in the most recent quarter.

No strategy - and no level of brilliance - will make every quarter or every year a successful one. Strategies become more or less effective as the environment changes and their popularity waxes and wanes. In fact, highly disciplined managers who hold most rigorously to a given approach will tend to report the worst performance when that approach goes out of favor. Regardless of the appropriateness of a strategy and the quality of investment decisions, every portfolio and every manager will experience good and bad quarters and years that have no lasting impact and say nothing about the manager's ability. Often this poor performance will be due to unforeseen and unforeseeable developments.

Thus, what does it mean that someone or something has performed poorly for a while? No one should fire managers or change strategies based on short-term results. Rather than taking capital away from underperformers, clients should consider increasing their allocations in the spirit of contrarianism (but few do). I find it incredibly simple: If you wait at a bus stop long enough, you're guaranteed to catch a bus, but if you run from bus stop to bus stop, you may never catch a bus.

I believe most investors have their eye on the wrong ball. One quarter's or one year's performance is meaningless at best and a harmful distraction at worst. But most investment committees still spend the first hour of every meeting discussing returns in the most recent quarter and the year to date. If everyone else is focusing on something that doesn't matter and ignoring the thing that does, investors can profitably diverge from the pack by blocking out short-term concerns and maintaining a laser focus on long-term capital deployment.

A final quote from Pioneering Portfolio Management does a great job of summing up how institutions can pursue the superior performance most want. (Its concepts are also relevant to individuals):

Appropriate investment procedures contribute significantly to investment success, allowing investors to pursue profitable long-term contrarian investment positions. By reducing pressures to produce in the short run, liberated managers gain the freedom to create portfolios positioned to take advantage of opportunities created by short-term players. By encouraging managers to make potentially embarrassing out-of-favor investments, fiduciaries increase the likelihood of investment success.

Oaktree is probably in the extreme minority in its relative indifference to macro projections, especially regarding the short term. Most investors fuss over expectations regarding short-term phenomena, but I wonder whether they actually do much about their concerns and whether it helps.

Many investors - and especially institutions such as pension funds, endowments, insurance companies, and sovereign wealth funds, all of which are relatively insulated from the risk of sudden withdrawals - have the luxury of being able to focus exclusively on the long term... if they will take advantage of it. Thus, my suggestion to you is to depart from the investment crowd, with its unhelpful preoccupation with the short term, and to instead join us in focusing on the things that really matter.

Navigating the Ins and Outs of a Microservice Architecture (MSA)

Key takeaways

  • MSA is not a completely new concept; it is about doing SOA correctly by taking advantage of modern technology.
  • Microservices address only a small portion of the bigger picture - architects need to treat MSA as an architecture practice and implement it in a way that is enterprise-ready.
  • Micro is not only about size; it is primarily about scope.
  • Integration is a key aspect of MSA and can be implemented as micro-integrations where applicable.
  • An iterative approach helps an organization move from its current state to a complete MSA.

Enterprises today contain a mix of services, legacy applications, and data, which are topped by a range of consumer channels, including desktop, web and mobile applications. But too often, there is a disconnect due to the absence of a properly created and systematically governed integration layer, which is required to enable business functions via these consumer channels. The majority of enterprises are battling this challenge by implementing a service-oriented architecture (SOA) where application components provide loosely-coupled services to other components via a communication protocol over a network. Eventually, the intention is to embrace a microservice architecture (MSA) to be more agile and scalable. While not fully ready to adopt an MSA just yet, these organizations are architecting and implementing enterprise application and service platforms that will enable them to progressively move toward an MSA.

In fact, Gartner predicts that by 2017 over 20% of large organizations will deploy self-contained microservices to increase agility and scalability, and it's happening already. MSA is increasingly becoming an important way to deliver efficient functionality. It serves to untangle the complications that arise with the creation of services; the incorporation of legacy applications and databases; and the development of web apps, mobile apps, or any consumer-based applications.

Today, enterprises are moving toward a clean SOA and embracing the concept of an MSA within a SOA. Possibly the biggest draws are the componentization and single-function focus offered by microservices, which make it possible to deploy a component rapidly and scale it as needed. It isn't a novel concept, though.

For instance, in 2011, a service platform in the healthcare space adopted a new strategy whereby, whenever it wrote a new service, it would spin up a new application server to support the service deployment. It's a practice that came from the DevOps side: it created an environment with fewer dependencies between services and ensured minimal impact on the rest of the systems in the event of maintenance. As a result, the services were running across more than 80 servers. It was, in fact, very basic, since the DevOps tools available today did not yet exist; instead, the team used shell scripts and Maven-type tools to build servers.

While microservices are important, they are just one aspect of the bigger picture, and it's clear that an organization cannot leverage their full benefits in isolation. The inclusion of MSA and the incorporation of best practices when designing microservices are key to building an environment that fosters innovation and enables the rapid creation of business capabilities. That's the real value add.

Addressing Implementation Challenges

The generally accepted practice when building your MSA is to focus on how you would scope out a service that provides a single function, rather than on the size of the service. The inner architecture typically addresses the implementation of the microservices themselves. The outer architecture covers the platform capabilities that are required to ensure connectivity, flexibility and scalability when developing and deploying your microservices. To this end, enterprise middleware plays a key role when crafting both the inner and outer architectures of the MSA.

First, middleware technology should be DevOps-friendly, contain high-performance functionality, and support key service standards. Moreover, it must support a few design fundamentals, such as an iterative architecture, and be easily pluggable, which in turn will provide rapid application development with continuous release. On top of these, a comprehensive data analytics layer is critical for supporting a design for failure.

The biggest mistake enterprises often make when implementing an MSA is to completely throw away established SOA approaches and replace them with the theory behind microservices. This results in an incomplete architecture and introduces redundancies. The smarter approach is to consider an MSA as a layered system that includes enterprise service bus (ESB)-like functionality to handle all integration-related functions. This will also act as a mediation layer: changes can be made at this level and then applied to all relevant microservices. In other words, an ESB or similar mediation engine enables a gradual move toward an MSA by providing the connectivity required to merge legacy data and services into microservices. This approach is also important for incorporating some fundamental rules, such as launching the microservice first and then exposing it via an API.

Scoping Out and Designing the 'Inner Architecture'

Significantly, the inner architecture needs to be simple so that each microservice is easily and independently deployable and independently disposable. Disposability is required in the event that the microservice fails or a better service emerges; in either case, the respective microservice must be easy to dispose of. The microservice also needs to be well supported by the deployment architecture and the operational environment in which it is built, deployed, and executed. An ideal example of this would be releasing a new version of the same service to introduce bug fixes, include new features or enhancements to existing features, and remove deprecated services.

The key requirements of an MSA inner architecture are determined by the framework on which the MSA is built. Throughput, latency, and low resource usage (memory and CPU cycles) are among the key requirements to take into consideration. A good microservice framework will typically build on a lightweight, fast runtime and modern programming models, such as annotated meta-configuration that is independent of the core business logic. Additionally, it should offer the ability to secure microservices using industry-leading security standards, as well as metrics to monitor the behavior of microservices.

With the inner architecture, the implementation of each microservice is relatively simple compared to the outer architecture. A good service design will ensure that six factors have been considered when scoping out and designing the inner architecture:

First, the microservice should have a single purpose and single responsibility, and the service itself should be delivered as a self-contained unit of deployment that can create multiple instances at the runtime for scale.

Second, the microservice should have the ability to adopt an architecture that's best suited for the capabilities it delivers and one that uses the appropriate technology.

Third, once the monolithic services are broken down into microservices, each microservice or set of microservices should have the ability to be exposed as APIs. However, within the internal implementation, the service could adopt any suitable technology to deliver the respective business capability. To do this, the enterprise may want to consider something like Swagger to define the API specification of a particular microservice, which the microservice can then use as its point of interaction. This is referred to as an API-first approach in microservice development; a minimal sketch follows this list of factors.

Fourth, for units of deployment, there are options such as self-contained deployable artifacts bundled into hypervisor-based images or container images, with container images generally being the more popular option.

Fifth, the enterprise needs to leverage analytics to refine the microservice, as well as to provision for recovery in the event the service fails. To this end, the enterprise can incorporate the use of metrics and monitoring to support this evolutionary aspect of the microservice.

Sixth, even though the microservice paradigm itself enables the enterprise to have multiple or polyglot implementations for its microservices, the use of best practices and standards is essential for maintaining consistency and ensuring that the solution follows common enterprise architecture principles. This is not to say that polyglot opportunities should be completely vetoed; rather, they need to be governed when used.
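To make the API-first idea in the third factor concrete, here is a minimal sketch of a single-purpose microservice. It is written in Python with Flask purely for illustration (the article prescribes no particular language or framework), and the catalog function, endpoint names, and data are all invented.

```python
# Minimal single-purpose microservice sketch (hypothetical catalog service).
# The API contract is fixed first; the implementation simply honors it.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# In-memory stand-in for the service's own data store; in an MSA each
# microservice owns its data and could swap this for any technology.
PRODUCTS = {
    "p-100": {"id": "p-100", "name": "Sensor kit", "price": 49.50},
    "p-200": {"id": "p-200", "name": "Gateway hub", "price": 120.00},
}

@app.route("/products/<product_id>", methods=["GET"])
def get_product(product_id):
    """Single responsibility: resolve one product by id."""
    product = PRODUCTS.get(product_id)
    if product is None:
        abort(404)  # unknown product
    return jsonify(product)

if __name__ == "__main__":
    # A self-contained unit of deployment; multiple instances of this
    # process can be started behind a router for scale.
    app.run(port=8080)
```

Because the contract (a /products/{id} resource) is agreed first, consumers depend on the API rather than on the implementation behind it.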

Addressing Platform Capabilities with the 'Outer Architecture'

Once the inner architecture has been set up, architects need to focus on the functionality that makes up the outer architecture of their MSA. A key component of the outer architecture is the introduction of an enterprise service bus (ESB) or similar mediation engine that will aid in connecting legacy data and services to the MSA. A mediation layer will also enable the enterprise to maintain its own standards while others in the ecosystem manage theirs.

The use of a service registry will support dependency management, impact analysis, and discovery of the microservices and APIs. It will also enable the streamlining of service/API composition and the wiring of microservices into a service broker or hub. Any MSA should also support the creation of RESTful APIs that will help the enterprise customize resource models and application logic when developing apps.

By sticking to the basics of designing the API first, implementing the microservice, and then exposing it via the API, the API rather than the microservice becomes consumable. Another common requirement enterprises need to address is securing microservices. In a typical monolithic application, an enterprise would use an underlying repository or user store to populate the required information from the security layer of the old architecture. In an MSA, an enterprise can leverage widely-adopted API security standards, such as OAuth2 and OpenID Connect, to implement a security layer for edge components, including APIs within the MSA.
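As a simplified illustration of that edge-security pattern, the sketch below checks a signed bearer token before a request is allowed through. This is a hand-rolled HMAC stand-in rather than real OAuth2 or OpenID Connect (where tokens are issued and validated by an identity provider), and the secret and token format are invented.

```python
# Simplified edge-security sketch: verify a signed bearer token before a
# request reaches a microservice. A real deployment would delegate this to
# an OAuth2/OpenID Connect provider instead of hand-rolling token checks.
import hashlib
import hmac

SECRET = b"shared-gateway-secret"  # hypothetical; normally kept in a vault

def sign(payload: str) -> str:
    return hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()

def validate_bearer(header: str) -> bool:
    """Expects 'Bearer <payload>.<signature>'; True only if untampered."""
    if not header.startswith("Bearer "):
        return False
    try:
        payload, signature = header[len("Bearer "):].rsplit(".", 1)
    except ValueError:
        return False
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(sign(payload), signature)

token = "user=alice;scope=read"
header = f"Bearer {token}.{sign(token)}"
print(validate_bearer(header))        # True: valid token passes
print(validate_bearer(header + "x"))  # False: tampered token is rejected
```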

On top of all these capabilities, what really helps to untangle MSA complexities is the use of an underlying enterprise-class platform that provides rich functionality while managing scalability, availability, and performance. That is because breaking down a monolithic application into microservices doesn't necessarily amount to a simplified environment or service. To be sure, at the application level, an enterprise is essentially dealing with several microservices that are far simpler than a single, complicated monolithic application. Yet the architecture as a whole may not necessarily be less arduous.

In fact, the complexity of an MSA can be even greater given the need to consider the other aspects that come into play when microservices need to talk to each other versus simply making a direct call within a single process. What this essentially means is that the complexity of the system moves to what is referred to as the "outer architecture", which typically consists of an API gateway, service routing, discovery, message channel, and dependency management.

With the inner architecture now extremely simplified--containing only the foundation and execution runtime used to build a microservice--architects will find that the MSA has a clean services layer. More focus then needs to be directed toward the outer architecture to address the complexities that arise there. Some common pragmatic scenarios that need to be addressed are explained below.

The outer architecture will require an API gateway to help it expose business APIs internally and externally. Typically, an API management platform will be used for this aspect of the outer architecture. This is essential for exposing MSA-based services to consumers who are building end-user applications, such as web apps, mobile apps, and IoT solutions.

Once the microservices are in place, some form of service routing takes place, in which requests arriving via APIs are routed to the relevant service cluster or service pod. Within the microservices themselves, there will be multiple instances that scale based on load. Therefore, some form of load balancing is required as well.

Additionally, there will be dependencies between microservices--for instance, if microservice A has a dependency on microservice B, it will need to invoke microservice B at runtime. A service registry addresses this need by enabling services to discover the endpoints. The service registry will also manage the API and service dependencies as well as other assets, including policies.
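The sketch below is a toy version of this discovery step, with round-robin selection standing in for load balancing. The service names and endpoints are invented, and a production MSA would use a dedicated registry such as Consul or Eureka rather than an in-process dictionary.

```python
# Toy service registry: microservice A resolves microservice B's endpoint
# at runtime instead of hard-coding it, with round-robin load balancing.
import itertools

class ServiceRegistry:
    def __init__(self):
        self._instances = {}  # service name -> list of endpoints
        self._cursors = {}    # service name -> round-robin iterator

    def register(self, name, endpoint):
        self._instances.setdefault(name, []).append(endpoint)
        # Rebuild the cursor so new instances join the rotation.
        self._cursors[name] = itertools.cycle(self._instances[name])

    def discover(self, name):
        """Return the next endpoint for a service, rotating across instances."""
        if name not in self._cursors:
            raise LookupError(f"no instances registered for {name!r}")
        return next(self._cursors[name])

registry = ServiceRegistry()
registry.register("order-service", "http://10.0.0.5:8080")
registry.register("order-service", "http://10.0.0.6:8080")

for _ in range(3):
    print(registry.discover("order-service"))  # alternates between instances
```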

Next, the MSA outer architecture needs messaging channels, which essentially form the layer that enables interactions within services and links the MSA to the legacy world. In addition, this layer helps to build a communication (micro-integration) channel between microservices, and these channels should use lightweight protocols, such as HTTP or MQTT.

When microservices talk to each other, there needs to be some form of authentication and authorization. With monolithic apps, this wasn't necessary because there was a direct in-process call. By contrast, with microservices, these translate to network calls. Finally, diagnostics and monitoring are key aspects that need to be considered to figure out the load type handled by each microservice. This will help the enterprise to scale up microservices separately.
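As a small illustration of the diagnostics point, this sketch counts calls and records latency per microservice so that each one's load can be observed and scaled independently. The service name is invented, and a real deployment would export such metrics to a monitoring stack rather than hold them in process.

```python
# Minimal per-service metrics sketch: count calls and record latency so
# each microservice's load can be observed and scaled independently.
import time
from collections import defaultdict

CALL_COUNTS = defaultdict(int)
LATENCIES = defaultdict(list)

def instrumented(service_name):
    """Decorator that records call counts and latencies for a service."""
    def decorator(fn):
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                CALL_COUNTS[service_name] += 1
                LATENCIES[service_name].append(time.perf_counter() - start)
        return wrapper
    return decorator

@instrumented("inventory-service")
def check_stock(item_id):
    time.sleep(0.01)  # stand-in for real work
    return {"item": item_id, "in_stock": True}

check_stock("i-1")
check_stock("i-2")
count = CALL_COUNTS["inventory-service"]
avg = sum(LATENCIES["inventory-service"]) / count
print(f"{count} calls, avg latency {avg:.4f}s")
```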

Reviewing MSA Scenarios

To put things into perspective, let's analyze some actual scenarios that demonstrate how the inner and outer architecture of an MSA work together. We'll assume an organization has implemented its services using Microsoft Windows Communication Foundation or the Java JEE/J2EE service framework, and developers there are writing new services using a new microservices framework by applying the fundamentals of MSA.

In such a case, the existing services that expose the data and business functionality cannot be ignored. As a result, new microservices will need to communicate with the existing service platforms. In most cases, these existing services will use the standards adhered to by the framework. For instance, old services might use service bindings such as SOAP over HTTP, Java Message Service (JMS) or IBM MQ, and be secured using Kerberos or WS-Security. In this example, messaging channels will play a big role in protocol conversion, message mediation, and security bridging from the old world to the new MSA.
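To make the mediation role concrete, here is a minimal sketch that converts a legacy SOAP-style XML reply into the JSON shape a new microservice might consume. The message fields are invented, and a real mediation layer would additionally handle transport bridging (for example, JMS to HTTP) and security translation.

```python
# Minimal mediation sketch: translate a legacy SOAP-style XML payload into
# the JSON a new microservice consumes. Field names are invented.
import json
import xml.etree.ElementTree as ET

LEGACY_REPLY = """
<Envelope>
  <Body>
    <GetCustomerResponse>
      <CustomerId>C-42</CustomerId>
      <Name>Acme Ltd</Name>
    </GetCustomerResponse>
  </Body>
</Envelope>
"""

def soap_to_json(xml_text: str) -> str:
    root = ET.fromstring(xml_text)
    response = root.find("./Body/GetCustomerResponse")
    return json.dumps({
        "customerId": response.findtext("CustomerId"),
        "name": response.findtext("Name"),
    })

print(soap_to_json(LEGACY_REPLY))
# -> {"customerId": "C-42", "name": "Acme Ltd"}
```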

Another aspect the organization needs to consider is the impact on its ability to scale as the business grows, given the limitations posed by a monolithic application; an MSA, by contrast, is horizontally scalable. Among the obvious limitations are the potential for errors, since it's cumbersome to test new features in a monolithic environment, and delays in implementing changes, which hamper the ability to meet immediate requirements. Another challenge is supporting a monolithic code base in the absence of a clear owner; with microservices, individual functions can be managed on their own, and each can be expanded quickly as required without impacting other functions.

In conclusion, while microservices offer significant benefits to an organization, adopting an MSA in a phased, iterative manner may be the best way to ensure a smooth transition. Key aspects that make MSA the preferred service-oriented approach are clear ownership and the fact that it fosters failure isolation, thereby enabling these owners to make services within their domains more stable and efficient.

About the Author

Asanka Abeysinghe is vice president of solutions architecture at WSO2. He has over 15 years of industry experience, which include implementing projects ranging from desktop and web applications through to highly scalable distributed systems and SOAs in the financial domain, mobile platforms, and business integration solutions. His areas of specialization include application architecture and development using Java technologies, C/C++ on Linux and Windows platforms. He is also a committer of the Apache Software Foundation.

Embedded analytics emerges to offer new level of business intelligence

Business analytics is an increasingly powerful tool for organisations, but one that is associated with steep learning curves and significant investments in infrastructure.

The idea of using data to drive better decision-making is well established. But the conventional approach – centred around reporting and analysis tools – relies on specialist applications and highly trained staff. Often, firms find they have to build teams of data scientists to gather the data and manage the tools, and to build queries.

This creates bottlenecks in the flow of information, as business units rely on specialist teams to interrogate the data, and to report back. Even though reporting tools have improved dramatically over the past decade, with a move from spreadsheets to visual dashboards, there is still too much distance between the data and the decision-makers.

Companies and organisations also have to deal with myriad data sources. A study from IDC found that close to four in five firms used more than 100 data sources, and just under one-third had more than 1,000. Often, this data exists in silos.

As a result, suppliers have developed embedded analytics to bring users closer to the data and, hopefully, lead to faster and more accurate decision-making. Suppliers in the space include ThoughtSpot, Qlik and Tableau, but business intelligence (BI) and data stalwarts such as Informatica, SAS, IBM and Microsoft also have relevant capabilities.

Embedded analytics adds functionality into existing enterprise software and web applications. That way, users no longer need to swap into another application – typically a dashboard or even a BI tool itself – to look at data. Instead, analytics suppliers provide application programming interfaces (APIs) to link their tools to the host application.

Embedded analytics can be used to supply mobile and remote workers access to decision support information, and even potentially data, on the move. This goes beyond simple alerting tools: systems with embedded analytics built in allow users to see visualisations and to drill down into live data.

And the technology is even being used to provide context-aware information to consumers. Google, for example, uses analytics to present information about how busy a location or service will be, based on variables such as the time of day.

Indeed, some suppliers describe embedded analytics as a “Google for business” because it allows users to access data without technical know-how or an understanding of analytical queries.

“My definition generally is having analytics available in the system,” says Adam Mayer, technical product director at Qlik. “That’s not your dedicated kind of BI tool, but more to the point, I think it’s when you don’t realise that you’re analysing data. It’s just there.”

The trend towards embedding analytics into other applications or web services reflects the reality that there are many more people in enterprises who could benefit from the insights offered by BI than there are users of conventional BI systems.

Firms also want to improve their return on investment in data collection and storage by giving more of the business access to the information they hold. And with the growth of machine learning and artificial intelligence (AI), some of the heavy lifting associated with querying data is being automated.

“What we are trying to do is supply non-technical users the ability to engage with data,” says Damien Brophy, VP for Europe, the Middle East and Africa (EMEA) at ThoughtSpot. “We’re bringing that consumer-like, Google-like experience to enterprise data. It is giving thousands of people access to data, as opposed to five or 10 analysts in the business who then produce content for the rest of the business.”

At one level, embedded analytics stands to replace static reports and potentially dashboards too, without the need to switch applications. That way, an HR or supply chain specialist can view and – to a degree – query data from within their HR or enterprise resource planning (ERP) system, for example.

A field service engineer could use an embedded analysis module within a maintenance application to run basic “what if” queries, to check whether it is better to replace a part now or carry out a minor repair and do a full replacement later.

Embedded analytics to help decision-making

Also, customer service agents are using embedded analytics to help with decision-making and to tailor offers to customers.

Embedded systems are designed to work with live data and even data streams, even where users do not need to drill down into the data. Enterprises are likely to use the same data to drive multiple analysis tools: the analytics, business development or finance teams will use their own tools to carry out complex queries, and a field service or customer service agent might need little more than a red or green traffic light on their screen.

“The basic idea is that every time your traditional reporting process finds the root cause of a business problem, you train your software, either by formal if-then-else rules or via machine learning, to alert you the next time a similar situation is about to arise,” says Duncan Jones, VP and principal analyst at Forrester.

“For instance, suppose you need to investigate suppliers that are late delivering important items. In the old approach, you would create reports about supplier performance, with on-time-delivery KPI and trends and you’d pore through it looking for poor performers.

“The new approach is to create that as a view within your home screen or dashboard, continually alerting you to the worst performers or rapidly deteriorating ones, and triggering a formal workflow for you to record the actions you’ve taken – such as to contact that supplier to find out what it is doing to fix its problems.”
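A minimal sketch of the rule-based approach Jones describes, using his supplier on-time-delivery example; the records, field names, and 90% threshold are invented for illustration.

```python
# Rule-based alerting sketch: flag suppliers whose on-time-delivery rate
# falls below a threshold, instead of waiting for a periodic report.
from collections import defaultdict

deliveries = [
    {"supplier": "Acme", "on_time": True},
    {"supplier": "Acme", "on_time": False},
    {"supplier": "Acme", "on_time": False},
    {"supplier": "Globex", "on_time": True},
    {"supplier": "Globex", "on_time": True},
]

THRESHOLD = 0.90  # alert when the on-time rate drops below 90%

def on_time_alerts(records, threshold=THRESHOLD):
    totals = defaultdict(lambda: [0, 0])  # supplier -> [on-time, total]
    for r in records:
        totals[r["supplier"]][0] += r["on_time"]
        totals[r["supplier"]][1] += 1
    return {
        supplier: on_time / total
        for supplier, (on_time, total) in totals.items()
        if on_time / total < threshold
    }

print(on_time_alerts(deliveries))  # -> {'Acme': 0.333...}
```

Embedded in a home screen or dashboard, a view like this would continually surface the worst performers and could trigger the follow-up workflow Jones mentions.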

This type of alerting helps businesses, because it speeds up the decision-making process by providing better access to data that the organisation already holds.

“It’s partly businesses’ need to move faster, to react more quickly to issues,” says Jones. “It’s also evolution of the technology to make embedded alert-up analytics easier to deliver.”

Embedded analytics suppliers are also taking advantage of the trend for businesses to store more of their data in the cloud, making it easier to link to multiple applications via APIs. Some are going a step further and offering analytical services too: a firm might no longer need expertise in BI, as the supplier can offer its own analytical capabilities.

Again, this could be via the cloud, but serving the results back to the users in their own application. And it could even go further by allowing different users to analyse data in their own workflow-native applications.

A “smart” medical device, such as an asthma inhaler, could provide an individual’s clinical data to their doctor, but anonymised and aggregated data to the manufacturer to allow them to plan drug manufacturing capacity better.
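A small sketch of how the same stream of readings could serve both audiences, with identifiers stripped before aggregation; all fields and values are invented for illustration.

```python
# Sketch of one data stream serving two audiences: the clinician sees an
# individual's readings, the manufacturer sees only de-identified totals.
from collections import Counter

readings = [
    {"patient_id": "p1", "region": "North", "doses_used": 3},
    {"patient_id": "p2", "region": "North", "doses_used": 5},
    {"patient_id": "p3", "region": "South", "doses_used": 2},
]

def clinician_view(patient_id):
    """The doctor sees the individual's own clinical data."""
    return [r for r in readings if r["patient_id"] == patient_id]

def manufacturer_view():
    """The manufacturer sees aggregated demand; patient ids never leave."""
    totals = Counter()
    for r in readings:
        totals[r["region"]] += r["doses_used"]
    return dict(totals)

print(clinician_view("p1"))  # individual readings for patient p1
print(manufacturer_view())   # -> {'North': 8, 'South': 2}
```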

“Data now is changing so quickly, you really need intraday reporting,” says Lee Howells, an analytics specialist at PA Consulting. “If we can put that in on a portal and allow people to see it as it happened, or interact with it, they are then able to drill down on it.

“It’s putting that data where employees can use it and those employees can be anyone from the CEO to people on operations.”

But if the advantage of embedded analytics lies in its ability to tailor data to the users’ roles and day-to-day applications, it still relies on the fundamentals of robust BI systems.

Firms considering embedded analytics need to look at data quality, data protection and data governance.

They also need to pay attention to security and privacy: the central data warehouse or data lake might have robust security controls, but does the application connecting via an API? Client software embedding the data should have equal security levels.

Cleaner data is critical

And, although cleaning data is always important for effective analytics and business intelligence, it becomes all the more critical when the users are not data scientists. They need to know that they can trust the data, and if the data is imperfect or incomplete, this needs to be flagged.

A data scientist working on an analytics team will have an instinctive feel for data quality and reliability, and will understand that data need not be 100% complete to improve decision-making. But a user in the field, or a senior manager, might not.

“Embedded analytics continues the democratisation of data, bringing data and insight directly to the business user within their natural workflow,” says Greg Hanson, VP for EMEA at Informatica.

“This fosters a culture of data-driven decision-making and can speed time to value. However, for CDOs [chief data officers] and CIOs, the crucial question must be: ‘is it accurate, is it trustworthy and can I rely on it?’ For embedded analytics programmes to be a success, organisations need confidence that the data fuelling them is from the right sources, is high quality and the lineage is understood.”

CDOs should also consider starting small and scaling up. The usefulness of real-time data will vary from workflow to workflow. Some suppliers’ APIs will integrate better with the host application than others. And users will need time to become comfortable making decisions based on the data they see, but also to develop a feel for when questions are better passed on to the analytics or data science team.

“Organisations, as part of their next step forward, have come to us with their cloud infrastructure or data lakes already in place, and they started to transform their data engineering into something that can be used,” says PA Consulting’s Howells. “Sometimes they put several small use cases in place as proof of concept and the proof of value. Some data isn’t as well used as it could be. I think that’s going to be a continually evolving capability.”

Communication, Media and Design

The Communication, Media and Design undergraduate bachelor's degree program encompasses 36 of the 120 credit hours required for a bachelor's degree.
This provides you with the opportunity to pursue multiple majors, minors or concentrations while working toward your Communication, Media and Design degree. All courses are 3 credits unless noted.

Clarkson Common Experience
The following courses are required for all students, irrespective of their program of study. These courses are offered during the fall semester, with FY100 First-Year Seminar being required of only first-year students. Both FY100 and UNIV190 are typically taken during the fall semester of the first year at Clarkson.

  • FY100  First-Year Seminar (1 credit)
  • UNIV190 The Clarkson Seminar (3 credits)

Communication, Media and Design Core Requirements
Students majoring in Communication, Media and Design are required to complete the following courses:

  • COMM210 Theory of Rhetoric for Business, Science & Engineering
  • COMM217 Introduction to Public Speaking
  • COMM490 Senior Communication Internship
  • COMM499 Communication Professional Experience

Communication, Media and Design Core Electives
The following are electives students are required to complete for the Communication, Media and Design major.

300-Level Communication, Media and Design Course:
Students must complete one Communication, Media and Design course at the 300-level from the following:

  • COMM312 Public Relations
  • COMM313 Professional Communications
  • COMM314 Placemaking, Marketing and Promotion
  • COMM330 Science Journalism

400-Level Communication, Media and Design Course:
Students must complete one Communication, Media and Design course at the 400-level from the following:

  • COMM410 Theory & Philosophy of Communication
  • COMM412 Organizational Communications & Public Relations Theory
  • COMM428 Environmental Communication

Courses with Technology Expertise:
Students must complete at least 6 credits with information technology expertise.

Mathematics/Statistics Electives:
Students must complete at least 6 credits from the mathematics (MA) and/or statistics (STAT) subject areas.

Science Electives:
Students must complete at least 6 credits, including a lab course, from the biology (BY), chemistry (CM), and/or physics (PH) subject areas.

Knowledge Area/University Course Electives:
Students majoring in communication will have approximately 42 credit hours available to use toward Knowledge Area and/or University Course electives.

Free Electives:
Students majoring in communication will have approximately 42 credit hours available to use toward courses of their choice.

Communication, Media and Design electives (21 credits) chosen from the following:

  • COMM100 2D Digital Design
  • COMM217 Introduction to Public Speaking
  • COMM219 Introduction Social Media
  • COMM226 Short Film Screenwriting
  • COMM229 Principles of User-Experience Design
  • COMM245 Writing for New Media
  • COMM310 Mass Media &amp; Society
  • COMM322 Typography & Design
  • COMM326 Feature Film Screenwriting
  • COMM327 Digital Video Production I
  • COMM329 Front-End Development for the Web
  • COMM345 Information Design
  • COMM360 Sound Design
  • COMM391-395 Special Topics
  • COMM420-425 Communication: Independent Study (1-9 credits)
  • COMM427 Digital Video Production II
  • COMM429 Full-stack Development
  • COMM447 Design-Driven Innovation
  • COMM448 Portraying Innovation Through the Lens
  • COMM449 Narrating Innovation
  • COMM450 Leading Innovation
  • COMM470 Communication Internship
  • COMM480 Undergraduate Teaching Assistantship in Communication & Media (1-3 credits)
Ministry of Education launches high-tech summer camp


By Carlena Knight


More than 200 students, ranging from primary school to the tertiary level, will take part in a summer camp put on by the Ministry of Education and the University of the West Indies Five Islands campus.

The camp will focus on six core areas – fundamentals of drones and drone technology, robotics, augmented reality, e-sports, photography and graphic design, and mobile app and web development – all of which are linked to science, technology, engineering, and mathematics, known as STEM subjects.

The four-week event was launched on Friday with top education officials in attendance.

Education Minister Daryll Matthew welcomed the programme and shared his hope that it will be rolled out across the nation’s schools.

“It is expected to be an annual event growing in leaps and bounds every year,” Matthew said.

“There are some countries where these STEM fairs are a major part of the educational offering of the country and we see no reason why Antigua and Barbuda, which is perfectly positioned in the Caribbean, having a university campus with a School of Computing and Artificial Intelligence, having an institution such as ABIIT on our shores, having access to fibre-to-the-home fast internet to each and every household, there is no reason why this ought not to be the next logical step in the offering of the Ministry of Education and Sports to our people of Antigua and Barbuda, giving our young people in particular the opportunity to really learn everything the worlds holds for them where technology is concerned,” Matthew added.

Director of Education Clare Browne spoke of the advantages the camp presents to local youngsters.

“STEM education, STEM skills, now and in the very near future, will be in every, every field of endeavour if they are not already there and if these skills and knowledge are not already a required field of endeavour.

“The word out there is that 75 percent of the jobs that children and young people will have in the future will be connected to the STEM field and so it is important for a Ministry of Education and Sports to ensure that we prepare our young people for jobs that are not even invented yet, for opportunities that are not yet in place and in play,” Browne explained.

“This STEM fair is designed to stimulate that kind of interest in those areas, those skills that are going to be required,” Browne added.

The free camp will run from July 18-29. It will break for Carnival and return from August 8-19.

It will be broken up into three age groups: children aged four to 11, youngsters aged 12 to 17 years, and tertiary level students over 17.

The camp will have both a virtual and a hands-on approach. Students engaged at the UWI Five Islands campus will learn the basics of computer assembly, coding, drone programming and graphic design, with some having the opportunity to earn an IBM certification.

Registration is now closed.

