Real Questions and latest syllabus of LOT-402 exam

From us, get valid and up-to-date IBM LOT-402 PDF Questions, available for download in just one click. We do not recommend wasting your time on free material. Just sign up for a small fee and your Administering IBM Connections 4.0 PDF Questions download account will be ready within minutes. Study our LOT-402 Real Exam Questions and pass your exam.

Exam Code: LOT-402 Practice test 2022 by team
Administering IBM Connections 4.0
IBM Administering Questions and Answers
Terminal Velocity

Back in the early days of business computing, operators accessed the back-end system – such as a mainframe or minicomputer – via a green screen terminal. Terminals were disposable devices. No one ever said, “Oh I wish I had a faster VT100 [terminal].” They had the terminal comprising a keyboard and a screen and used it as the user interface to access software running on the back-end system.

The Commodore PET, Apple II and Tandy TRS-80, released in 1977, directly led to the development of the IBM PC and the desktop computing revolution. The rest, as they say, is history. Terminal-based computing was relegated to the recycle bin, and after the introduction of the Apple Mac in 1984 and later Windows, the command line user interface, as used on terminals, was effectively killed off.

SaaS as terminal-based computing

What has this got to do with modern desktop computing? While no one – apart from a certain genre of system admin – requires a command line user interface, we are all using a modern version of the terminal each and every day. Browser-based computing is nothing more than a user interface on top of a back-end web-server application. The dominance of SaaS in enterprise IT has demonstrated the effectiveness of this deployment model.

There is no need to install software images on physical PC hardware. Everything runs in the cloud and can be accessed through a browser. The Mac versus PC or iOS versus Android debate goes away too, as the software is generally compatible across different web browsers. So, with the ability to run on any hardware, there should be very little need to purchase new devices. In fact, end user devices are just like the terminals in the 1970s. They effectively provide access to a suite of software, running on back-end systems.

In a recent chat with Computer Weekly, Gartner analyst Annette Zimmerman said that only 25% of a device’s total carbon footprint is produced during its useful life. Given that 50 million business PCs ship each year and each one produces 350 grams of CO2 emissions, anything business IT can do to extend the upgrade cycle has a direct impact on global warming. Increasing a laptop’s life from three to four years also stretches the hardware’s acquisition cost: that’s 33% more useful life, with the same cost spread over four years instead of three.
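The amortization arithmetic above can be sketched in a few lines (the laptop price below is a hypothetical figure for illustration; the shipment and per-unit emissions numbers are the article's own):

```python
def annual_cost(acquisition_cost: float, lifespan_years: int) -> float:
    """Acquisition cost spread evenly over the device's useful life."""
    return acquisition_cost / lifespan_years

laptop_price = 1200.0  # hypothetical purchase price, in dollars

per_year_3 = annual_cost(laptop_price, 3)  # annual cost over a three-year life
per_year_4 = annual_cost(laptop_price, 4)  # annual cost over a four-year life

# One extra year on a three-year base is the "33% more useful life" above,
# and the annual acquisition cost falls by a quarter.
extra_life = (4 - 3) / 3
annual_saving = 1 - per_year_4 / per_year_3

# Fleet-level emissions from the article's figures:
# 50 million PCs at 350 g of CO2 each, converted from grams to tonnes.
fleet_co2_tonnes = 50_000_000 * 350 / 1_000_000
```

Under these assumptions, a $1,200 laptop drops from $400 to $300 of acquisition cost per year of use.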

It’s good news for those who control the budget, and it demonstrates how IT can help prevent global temperatures from rising by more than 1.5°C. The big question then becomes whether employees are prepared to use old hardware. Perhaps there are still lessons society can learn from terminal-based computing.

Published: Thu, 23 Jun 2022
Killexams : SAS Certification Guide: Overview and Career Paths

What first began as an agricultural research project at North Carolina State University eventually grew into a full-fledged software and services company by 1976. SAS has gone on to develop a solid customer base in the banking and pharmaceutical industries as well as in academia and at numerous agencies at all levels of government. Today, SAS is a leader in business analytics, data warehousing and data mining.

SAS has been recognized as one of the best places to work by organizations like Fortune and the Great Place to Work Institute. Coming in at No. 37, SAS made its 21st appearance in Fortune’s list of 100 best companies to work for in 2017 and No. 23 in Fortune’s list of top 100 best workplaces for millennials in 2018. Indeed, the company’s low turnover rate (4 percent in 2016) is an indicator of its commitment to its employees and, indirectly, to its customers as well.

Of the top 100 Fortune Global 500 companies, 96 are SAS customers. SAS customers span the globe with more than 83,000 instances installed in 144 countries.

SAS certification program overview

SAS has awarded more than 100,000 certifications since the program’s introduction in 1999, according to Brightcove. Today, the SAS Global Certification Program offers 23 credentials across seven categories:

  • Foundation Tools
  • Advanced Analytics
  • Business Intelligence and Analytics
  • Data Management
  • Administration
  • JMP
  • Partners

SAS certifications, along with required exams and costs, are described in more detail in the following sections. Although experience levels aren’t specifically indicated for each certification, a good rule of thumb is a minimum of eight months of experience on the base SAS system for Base Programmers and two to three years of relevant, hands-on experience for all other certifications before candidates tackle their respective exams. 

All exams are administered by Pearson VUE or through a SAS-sponsored certification test session (typically in conjunction with a training course). SAS also offers online proctored exams for all certification credentials through their partnership with Pearson VUE. Note that all certifications covered below are based on SAS 9.4.

SAS Foundation Tools certifications

Foundation Tools credentials aim at SAS professionals whose workdays revolve around writing and managing SAS programs. The company currently offers three Foundation Tools certifications:

  • SAS Certified Base Programmer for SAS 9: The Base Programmer knows how to query databases and perform analyses, including importing and exporting raw data files, manipulating data, combining SAS data sets and creating reports. The certification test features 50 to 55 multiple-choice questions, a 100-minute time limit, and candidates must answer at least 70 percent of the questions correctly to pass the exam.
  • SAS Certified Advanced Programmer for SAS 9: The Advanced Programmer must be skilled in writing advanced DATA step programming statements and solving complex problems as well as interpreting SAS SQL code and writing SAS macros. The certification test features 60 to 65 multiple-choice questions, a 100-minute time limit and candidates must answer at least 65 percent of the questions correctly to pass the exam.
  • SAS Certified Clinical Trials Programmer Using SAS 9: This programmer works exclusively with clinical trials data, transforming raw data into polished, validated reports. There are two different paths to achieving this certification:
    • Clinical Trial Programmer Exam: The certification test features 90 to 100 multiple-choice questions, a three-hour time limit, and candidates must answer at least 70 percent of the questions correctly to pass the exam. There are no prerequisites required to take this exam.
    • Clinical Trial Programmer – Accelerated Exam: The certification test features 70 to 75 multiple-choice questions, a 120-minute time limit, and candidates must answer at least 70 percent of the questions correctly to pass the exam. Candidates must already have completed the SAS Base Programmer certification before they can sit for this accelerated exam.
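Since the question counts above are ranges and the cut scores are percentages, the minimum number of correct answers follows from the worst case; here is a quick sketch (the `min_correct` helper is illustrative, not a SAS tool):

```python
import math

def min_correct(num_questions: int, pass_fraction: float) -> int:
    """Smallest whole number of correct answers that meets the cut score."""
    return math.ceil(num_questions * pass_fraction)

# Worst-case (largest) question counts from the exam descriptions above.
base_programmer = min_correct(55, 0.70)      # 70% of 55 questions
advanced_programmer = min_correct(65, 0.65)  # 65% of 65 questions
clinical_trials = min_correct(100, 0.70)     # 70% of 100 questions
```

So a candidate sitting the longest version of the Base Programming exam needs 39 of 55 questions correct.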


Certification, required exam(s) and cost:

  • SAS Certified Base Programmer for SAS 9 – SAS Base Programming for SAS 9 exam (A00-211); $180
  • SAS Certified Advanced Programmer for SAS 9 – SAS Advanced Programming for SAS 9 exam (A00-212); requires the SAS Certified Base Programmer for SAS 9 credential; $180
  • SAS Certified Clinical Trials Programmer Using SAS 9 – Clinical Trials Programming Using SAS 9 exam (A00-280), or the Accelerated Version exam (A00-281), which requires the SAS Certified Base Programmer for SAS 9 credential; $180 each exam

* SAS offers a 50 percent discount on the cost of all exams to instructors, students, faculty and staff at schools and higher education institutions as well as to SAS employees. See the SAS FAQs for details.

SAS Advanced Analytics certifications

The Advanced Analytics credentials are designed for SAS professionals who gather, manipulate and analyze big data using SAS tools, run reports and make business recommendations based on complex models. The certifications in this category include:

  • SAS Certified Data Scientist Using SAS 9: Data scientists can use both open source and SAS tools to manipulate data, form conclusions and make recommendations, and deploy SAS solutions. Rather than being a stand-alone certification, the Certified Data Scientist credential is attained by completing the Certified Big Data Professional and Certified Advanced Analytics Professional credentials.
  • SAS Certified Advanced Analytics Professional Using SAS 9: An Advanced Analytics Professional uses predictive modeling and statistical analysis methods and techniques to analyze big data sets. Candidates must pass three exams to complete this certification:
    • Predictive Modeling Using SAS Enterprise Miner 7, 13 or 14
    • Advanced Predictive Modeling
    • Text Analytics, Time Series, Experimentation and Optimization
  • SAS Certified Predictive Modeler Using SAS Enterprise Miner 14: A Predictive Modeler prepares data and then builds, assesses, and implements predictive models. The certification test features 55 to 60 multiple-choice and short-answer questions, a 165-minute time limit, and candidates must achieve a score of at least 725 points from a possible point range of 200 to 1,000 points to pass the exam. Find out more about scaled scores via the test FAQs. There are no credential prerequisites required to take this exam, though a thorough knowledge of a number of related skills and techniques is required.
  • SAS Certified Statistical Business Analyst Using SAS 9 Regression and Modeling: The Statistical Business Analyst uses SAS software to perform statistical analyses and predictive modeling, including linear and logistic regression, variance analysis and model performance measurements. The certification test features 60 multiple-choice and short-answer questions, a two-hour time limit, and candidates must answer at least 68 percent of the questions correctly to pass the exam. There are no credential prerequisites required to take this exam, though a thorough knowledge of a number of related skills and techniques is required.


Certification, required exam(s) and cost:

  • SAS Certified Data Scientist Using SAS 9 – No exam is required; the credential is awarded to candidates who hold both the SAS Certified Big Data Professional Using SAS 9 (two exams, $180 each, $360 total) and SAS Certified Advanced Analytics Professional Using SAS 9 (three exams, $610 total) credentials
  • SAS Certified Advanced Analytics Professional Using SAS 9 – Predictive Modeling Using SAS Enterprise Miner 13 exam ($250; waived for holders of the SAS Certified Predictive Modeler Using SAS Enterprise Miner 7, 13 or 14 credential); SAS Advanced Predictive Modeling exam (A00-225, $180); SAS Text Analytics, Time Series, Experimentation and Optimization exam (A00-226, $180)
  • SAS Certified Predictive Modeler Using SAS Enterprise Miner 14 – Predictive Modeling Using SAS Enterprise Miner 14 exam
  • SAS Certified Statistical Business Analyst Using SAS 9: Regression and Modeling – SAS Statistical Business Analysis Using SAS 9: Regression and Modeling exam (A00-240)


SAS Business Intelligence and Analytics certifications

The Business Intelligence and Analytics credentials are designed for IT professionals who create interfaces and reports for SAS 9 or who use SAS Visual Analytics routinely. For the following certifications, there are no credential prerequisites required to take these exams, though a thorough knowledge of a number of related skills and techniques is required. The three certifications in this category are:

  • SAS Certified BI Content Developer for SAS 9: The BI Content Developer focuses mainly on creating, implementing and customizing SAS interface applications (such as reporting applications), including data management and dashboard creation. The certification test features 60 to 65 multiple-choice questions, a two-hour time limit, and candidates must answer at least 70 percent of the questions correctly to pass the exam.
  • SAS Certified Visual Business Analyst Exploration and Design Using SAS Visual Analytics: This credential targets analysts who use SAS Visual Analytics, Visual Analytics Explorer, and Visual Analytics Designer to add, manipulate and explore data, and create reports. The certification test features 60 to 65 multiple-choice questions, a 100-minute time limit, and candidates must answer at least 68 percent of the questions correctly to pass the exam.
  • SAS Certified Visual Modeler Using SAS Visual Statistics 7.4:  The Certified Visual Modeler targets analysts who perform model fitting and analysis as well as predictive and exploratory modeling in business environments using SAS Visual Statistics. The certification test features 50 multiple-choice, short-answer and interactive questions, a 90-minute time limit and candidates must answer at least 68 percent of the questions correctly to pass the exam.


Certification and required exam:

  • SAS Certified BI Content Developer for SAS 9 – SAS BI Content Development for SAS 9 exam (A00-270)
  • SAS Certified Visual Business Analyst: Exploration and Design Using SAS Visual Analytics – SAS Visual Analytics 7.4 Exploration and Design exam (A00-277)
  • SAS Certified Visual Modeler Using SAS Visual Statistics 7.4 – SAS Interactive Model Building and Exploration Using SAS Visual Statistics 7.4 exam (A00-272)


SAS Data Management certifications

Professionals whose workdays (or career aspirations) revolve more around managing data and platforms rather than deep statistics, analysis and modeling will gravitate to data management credentials. For the following certifications, there are no credential prerequisites required to take these exams, though a thorough knowledge of a number of related skills and techniques is required. The three certifications in this category are:

  • SAS Certified Big Data Professional Using SAS 9: The Big Data credential is designed for professionals who conduct statistical analyses using SAS and open source data-management tools. This credential is comprised of two separate exams:
    • Big Data Preparation, Statistics and Visual Exploration
    • SAS Big Data Programming and Loading
  • SAS Certified Data Integration Developer for SAS 9: The Data Integration Developer works with data to prepare it for analysis and reporting. Candidates must know how to define the platform architecture for SAS Business Analytics as well as create metadata, transform data and tables, and run jobs. The certification test features 76 multiple-choice questions, a 105-minute time limit, and candidates must answer at least 70 percent of the questions correctly to pass the exam.
  • SAS Certified Data Quality Steward for SAS 9: The Data Quality Steward validates the skills of professionals who use the SAS DataFlux Data Management Studio. The certification test features 75 multiple-choice questions, a 110-minute time limit, and candidates must answer at least 68 percent of the questions correctly to pass the exam.


Certification, required exam(s) and cost:

  • SAS Certified Big Data Professional Using SAS 9 – SAS Big Data Preparation, Statistics, and Visual Exploration exam (A00-220) and SAS Big Data Programming and Loading exam (A00-221); $180 each, $360 total
  • SAS Certified Data Integration Developer for SAS 9 – SAS Data Integration Development for SAS 9 exam (A00-260)
  • SAS Certified Data Quality Steward for SAS 9 – SAS Data Quality Using DataFlux Data Management Studio exam (A00-262)


SAS Administration certifications

The administration category has a single credential – the SAS Certified Platform Administrator for SAS 9 – designed for professionals responsible for supporting the SAS Business Analytics platform from installation through day-to-day maintenance. Candidates must know how to set up folders, manage user accounts, monitor system performance, apply security techniques, perform backups and complete other administrative tasks. The certification test features 70 multiple-choice questions, a 110-minute time limit, and candidates must answer at least 70 percent of the questions correctly to pass the exam.


Certification and required exam:

  • SAS Certified Platform Administrator for SAS 9 – SAS Platform Administration for SAS 9 exam (A00-250)


JMP certifications

SAS JMP is data analysis and visualization software that allows users to explore, mine and share data analyses in a graphical format. The JMP category includes two credentials, each with its own exam:

  • JMP Certified Specialist – JMP Scripting Using JMP 14: Using the JMP Scripting Language (JSL), candidates will demonstrate their proficiency in completing various programming and scripting tasks.
  • JMP Certified Specialist – Design and Analysis of Experiments Using JMP 14: Successful candidates will demonstrate their abilities and skills in designing and analyzing various industrial experiments, including evaluating model assumptions and process optimization.

Both of these certification exams feature 50 to 60 multiple-choice and short-answer questions, a 150-minute time limit, and candidates must achieve a score of at least 725 points from the possible point range of 200 to 1,000 points. Find out more about scaled scores via the test FAQ.


Certification and required exam:

  • JMP Certified Specialist: JMP Scripting Using JMP 14 – JMP Scripting Using JMP 14 exam (A00-908)
  • JMP Certified Specialist: Design and Analysis of Experiments Using JMP 14 – Design and Analysis of Experiments Using JMP 14 exam (A00-909)


Partner certifications

SAS offers credential programs for certified SAS resellers, VARs and consultants through its partner program. There are six partner credentials available to SAS partners.

Access to the Partner credentialing portal is restricted to authorized SAS partners only. As a result, some details of the partner test process are hidden from public view. If you work for a SAS partner, ask your company SAS liaison or your SAS sales team for more details about partner certifications.

Why get SAS certified? More demonstrated knowledge typically means more earning power. A quick search for SAS programmer on major job posting sites returns hundreds, if not thousands, of open positions across the United States, most with starting salaries in a range from $52,000 to $126,000.

While salaries vary depending on industry, experience, certifications achieved and other factors, average SAS programmer earnings are $81,038 nationally, per SimplyHired. Average earnings in SAS statistical modeling careers are almost $110,000, topping out around $200,000.

Tip: To make the hiring process easier, employers can verify a candidate’s credentials by searching the SAS Global Certified Professional Directory.

Suffice it to say, a SAS certification can pay off quickly. Couple that with an outstanding company culture, and achieving a SAS credential, or even working for SAS itself, is a smart career move for anyone with programming chops and a desire for longevity with a single company.

SAS careers are plentiful and vary depending on your goals and career aspirations. Some SAS careers naturally gravitate toward one or more certification categories. Some examples of common job roles by certification category include:

  • Foundation Tools: Analysts, programmers and data managers
  • Advanced Analytics: Technical analysts, statisticians and data scientists
  • Business Intelligence and Analytics: Content developers and analysts
  • Data Management: Analysts, programmers and data managers
  • Administration: Platform administrators

Regardless of your area(s) of interest, you’ll find a plethora of SAS careers available.

Training and resources

SAS offers links to SAS classroom and eLearning courses, sample exam questions and full practice exams. Refer to the Exam Preparation tab for each certification on the SAS Certification website. Candidates can purchase certification packages that include training courses, preparation materials and exam vouchers, with typical discounts of 35 to 40 percent.

SAS training can be pricey, depending on factors such as delivery method and class length.

Individual courses range from lows around $1,100 to highs of $4,000. Candidates should be sure to check out the SAS Discounts web page for information on current discount programs, best value deals, veteran’s discounts and more before enrolling.

The SAS Training and Books webpage provides links to certification prep books, training courses, eLearning opportunities (SAS onDemand) and the SAS Global Academic Program. SAS also offers sample exam questions, and training software may be accessed through the SAS University Edition.

Many colleges and universities, such as Philadelphia University, Florida State University and the University of Missouri, to name just a few, also offer SAS certificate programs to their undergraduate and graduate students. If you’re in (or thinking of going to) college and want to get SAS certified as part of your degree program, it pays to check out your SAS certification options before choosing an institution of higher learning.

Credit: Ed Tittel

Ed Tittel

Ed is a 30-year-plus veteran of the computing industry, who has worked as a programmer, a technical manager, a classroom instructor, a network consultant and a technical evangelist for companies that include Burroughs, Schlumberger, Novell, IBM/Tivoli and NetQoS. He has written for numerous publications, including Tom’s IT Pro, and is the author of more than 100 computing books on information security, web markup languages and development tools, and Windows operating systems.

Credit: Earl Follis

Earl Follis
Earl is also a 30-year veteran of the computer industry, who worked in IT training, marketing, technical evangelism and market analysis in the areas of networking and systems technology and management. Ed and Earl met in the late 1980s when Ed hired Earl as a trainer at an Austin-area networking company that’s now part of HP. The two of them have written numerous books together on NetWare, Windows Server and other topics. Earl is also a regular writer for the computer trade press with many e-books, white papers and articles to his credit.

Published: Tue, 28 Jun 2022
Killexams : How Lenovo is making the transition into services

Major enterprise IT suppliers have been doubling down on services, making their offerings available on a subscription-based, pay-per-use basis and positioning themselves as one-stop shops that offer a suite of hardware, software and applications managed on behalf of their customers.

The shift toward services is being driven by more enterprises that are looking for outcome-based IT services that offer the flexibility and agility of public cloud services where users pay only for the IT resources they use.

While Hewlett Packard Enterprise (HPE) and Dell Technologies have been staking their claim in the “as-a-service” space since early on, Lenovo only started to make a deeper push into services through its TruScale offering in 2019.

On a recent trip to Asia, Wilfredo Sotolongo, chief customer officer of Lenovo’s infrastructure solutions group, spoke to Computer Weekly about the company’s growth strategy, its service-led transformation and the impact of the pandemic on its business. Here are his answers to our questions.

What is on your mind right now as you are travelling across the region?

Sotolongo: During this trip, I’m watching what progress we have made in our customer intimacy and customer relationships over the last three years. My observation is that while the pandemic has kept us apart, forcing us to interact virtually, in some places, we’ve tightened those relationships and created more intimacy with our customers. Lenovo customer-facing teams are now better than they were three years ago, listening carefully to customer pain points and needs, and then designing solutions for them.

I am also looking for evidence of where this market is going. The market has been booming over the last year, and at some point it will stop booming because every market goes through cycles. So, I’m looking for evidence of what would be the trigger for that. We are in an upcycle now, especially in Asia-Pacific, and so far I don’t see evidence of a slowdown, but I’m looking for that.

While you’re looking for answers to that, what is your go-to-market strategy for the Asia-Pacific region?

Sotolongo: We’ve got three primary strategies – one has been going on for a long time and two are relatively new. The first is what I talked about, which is the ability of our datacentre teams to develop greater intimacy and provide more solutions to clients. We also launched two new initiatives in the last 12 months – One Lenovo and service-led transformation.

Today, we have a datacentre division, a PC division and a phone division, with an overlay services group. And we execute in front of our customers, independent of each other. The great advantage of that is you have focus and disciplined execution. The disadvantage is you cannot aggregate different components of your portfolio to create solutions.

One Lenovo flips the model on its side and enables us to create complete solutions spanning the phone to the datacentre. We began the journey over a year ago and we already took a definitive position and transformed the channel. So today, we have a single channel per market team. We face distributors and resellers as one organisation. We are now running a pilot in India for One Lenovo. If that goes well, we will flip the operating model in other markets some time next year.


Traditionally, Lenovo has been a hardware solutions provider with little services, software and solutions content. So our service-led transformation is about having capabilities within Lenovo to deliver services built on assets we already have. We’ve created a new services and solutions group which has a mandate to grow at twice the rate of growth of the company through as-a-service offerings like TruScale and industry offerings for industries like retail and manufacturing.

Are you doing anything differently in this region?

Sotolongo: Besides the One Lenovo pilot in India, we’ve also had success with high-performance computing (HPC) in Asia-Pacific. With focus, we’ve finally cracked the code in HPC, winning customers such as the Korea Meteorological Administration and Australia’s National Computational Infrastructure. The other one is TruScale, which started as a datacentre offering but now includes PCs. We haven’t really led into the market with our services offerings until last year or so, but we’re beginning to see traction since we started to put priority on it.

Your competitors, such as Dell and HPE, are also doubling down on as-a-service offerings. How are you convincing someone to go with Lenovo versus other offerings in the market?

Sotolongo: I think the reality is, I don’t need to convince them. Customers made the bet on a new model of technology consumption years ago, starting with the cloud and all the variations, whether it’s multicloud or hybrid cloud. So, they’re on their way and we just had to show up. Right now, I would say at least half of my customer conversations include some kind of as-a-service question. It’s not because I have the offering; it’s because they’re looking to acquire and consume technology this way.

HPE started way before we did, but we’re catching up very quickly. We are definitely the underdog here. Many times, we were surprised by doing better than most people expect. We have a very simple offering. The current one is based on power consumption, but we will also have other metering measurements. We are differentiating by showing up a year ago, when we were not even showing up.

Look, the base hardware is from Intel or AMD, and while their designs are slightly different, they don’t make a big difference. The difference is: who’s across the table, listening carefully and trying to build a solution for your needs? And then, who are the individuals and what is the technology stack you’ve built to deliver on that service? Because in as-a-service, you’re no longer selling a piece of hardware. You’re selling an SLA [service-level agreement] and you’re more embedded in the relationship. If a virtual machine goes down, you’ve got to figure out why and fix it, because there are penalties.

For some years now, Lenovo has been positioning itself as more than just a supplier that ships boxes. Have you overcome that perception of Lenovo as a hardware company?

Sotolongo: We’ve made significant progress, but it’s not enough. I’m not satisfied, and I want to do more. And by the way, it’s not just my problem, it’s also a problem for the other guys. Back to your question on what’s on my mind, I’m thinking about whether we are deepening relationships, building intimacy, and you don’t build any of that by having a box conversation.

How would you know you’ve got there in terms of convincing customers that Lenovo doesn’t just ship boxes?

Sotolongo: For sure, it will be through our financial results, market share gains and margin expansion. We’ve been gaining share now for almost three years, and we’ve had margin expansion for the last three years. Still, I’m not happy.

The other indicators would be things like the nature of the business we get called to talk about. Are we getting called to talk about supplying a server configured in a certain way at the best price? Or is it about how to deploy a remote office, branch office or an edge solution?

I had lunch with a customer here in Singapore where we first talked about the hardware, but they later opened up and told us they were going to deploy more advanced AR [augmented reality] and VR [virtual reality] technologies, endpoint devices, and device management software. That’s a different conversation. That’s a way to measure. So, on the one hand, I was delighted that I visited our customer; I was even more delighted when they opened up for new use cases that we don’t normally see in the datacentre space.

How have supply chain disruptions and chip shortages amid the pandemic affected your business and relationships with customers?

Sotolongo: At least three different ways. First, customers are consuming more technology than we ever predicted. The demand last year was stronger than predicted. And so far, this year is stronger than predicted. I’m sure you agree with what I just said, because all of my competitors are saying it.

But at the onset of the pandemic, everybody was nervous of the opposite effect, where economies were going to go into recession and customers were going to stop spending and the world economy would collapse. That was a concern back then. In fact, it did collapse for a couple of months. And now, here we are two and a half years later, we just experienced the most powerful demand in years. That’s the first change.

The second change is the supply chain, which is related to demand, but the difference is, we’ve never experienced a supply chain disruption at the chip level. All the supply chain disruptions in the past were at the subsystem level, like hard drives where a flood somewhere took out some factories, but not resistors or power control chips.

We've had so many restrictions on supply coming from so many different places that it's a nightmare in terms of supply planning. I think we did better than our competitors, especially in the first three quarters. We didn't do that well in the last quarter because of the China lockdowns, and we have a dependency on China that hurt us.

The last is, frankly, we have proven that, as a society and as a company, we are very resilient. We figured a way to do virtually so many different things that we never thought we could do. We conducted 100% of our business virtually in the past two years. The desire was to operate in a whole new way. We’ve started to see customers face to face in the last six months. We’ve sold more than we have ever sold. We’ve supported more than we’ve ever supported. To me, it’s a very inspiring story about society and people.

The pandemic has worsened talent shortages and many companies are finding it harder to find the talent they need. What are your thoughts on the situation? How is Lenovo overcoming talent shortages?

Sotolongo: We all face talent shortages in this industry because it’s a fast-growing industry. But generally speaking, it’s not been too bad in terms of our ability to find people. There are certain areas where it’s much harder than others. For example, in the area of hyperscale sales, where we need people who interact with hyperscale customers, there are very few qualified people.

We also face challenges in customer-facing roles, but it's no different than it was four years ago. That said, different markets have different characteristics, and I think we do better in hiring people in Singapore, Malaysia and Thailand than we do in Taiwan and Japan. But that has more to do with how a typical Japanese or Taiwanese views Lenovo, so there's a geopolitical aspect to it.

Build a Career As a Consultant in Retirement

Stan Kimer hadn’t planned on working after taking early retirement from IBM at age 55. He imagined filling his days with travel and volunteer service. But during a year of transitional coaching included in his retirement package, he decided to become an independent consultant. A decade later, he runs Total Engagement Consulting, a diversity and career development firm in Raleigh, N.C.

Kimer appreciates that he can focus on his passions: diversity training with expertise in LGBTQ concerns, unconscious bias and employee resource groups, and skills-based career planning so companies can engage and retain their best employees.

That said, he made some mistakes along the way, like initially underestimating business development time. He encourages other would-be consultants to plan carefully, pursue a product or service with demonstrated demand, and expect to take years to build revenue.

“I love being my own boss and controlling my own destiny,” Kimer said. “Stepping out of your comfort zone and taking on challenges can be a great way to keep intellectually growing and staying vital while you age.”

Take Your Time

Starting a consultancy isn’t as simple as ordering a stack of business cards or creating a website. “SmallBizLady” Melinda F. Emerson recommends taking a year to lay the groundwork, or 18 months if you can. Begin by asking yourself why a consultancy appeals to you and what lifestyle you envision, says Emerson, author of Become Your Own Boss in 12 Months (Adams Media, $17).

Do you hope to work fewer hours? Do you want flexibility for leisure travel? Your approach will be different if you're just looking for mental stimulation than if you need to replace a salary. Do you want to work remotely or at clients' offices? Do you possess the energy, self-discipline, personal savings and other resources needed to sustain you until you're profitable?

The answer increasingly is yes, especially for retirees or people approaching retirement. The fastest-growing segment of entrepreneurs is people 50 years old or older, Emerson says. People considering this path should, if possible, launch their new enterprise as a side hustle before they leave their day job. "I'm a big advocate of learning as many expensive lessons as you can when you're still working for someone else," she says.

Margot Halstead of Arlington, Va., wanted more autonomy as a government contractor while replacing the full-time income that supports her family, since her husband is fully disabled. Fortuitous connections at an industry conference gave her four promising leads, and a mentor helped her through the first year of consulting at her own firm, Orahill. “I hung my own shingle on Jan. 1, 2019, and made $80K my first year of independence. Not too shabby,” says Halstead, 52.

Define Your Niche

As an independent consultant, you’re a small business. So you need to clearly understand your target customers and what unique value you provide them. Look at other consultants in the field you want to enter. Study their websites, LinkedIn pages, client lists, marketing materials and social media presence. If you don’t see anyone offering services remotely similar to what you want to provide, beware. That could be a sign that there’s insufficient demand for your services.

Understand your strengths, skills and potential to learn—as well as the functions that you will not be able to provide personally. Are you a technical whiz? You might be able to hire a low-cost web design firm that trains you to be your own webmaster. If you’re great with numbers, dive into QuickBooks or another software package for accounting, invoicing and bill payment. If not, outsource. “You’re now doing every aspect of your big company,” says Ann Lim, volunteer mentor with the Service Corps of Retired Executives (SCORE), a national non-profit that helps business owners and aspiring entrepreneurs with human resources, sales and marketing, information technology, operations, facilities, customer service, development, finance and accounting. “The setup of a small business isn’t so complicated,” Lim says, “but it’s a lot of little things.”

Every small business should retain professional services in four areas: accounting, law, banking and insurance, says Lim, whose firm, Sweetspires, in Herndon, Va., provides general management services for small businesses. While you may not need a lawyer on retainer or an in-house accountant, a few hours’ consultation can set you up for success.

Your business may feel launched when you take on your first paying client or when your website goes live. Lim's website has a checklist of common steps needed to launch. Expect to continue adjusting as your plans meet the reality of the market. Rely on your existing network and create new relationships to help answer the many questions that arise, from pricing to lead generation, contract negotiation and customer relationship management. SCORE, the Small Business Administration and professional associations in your field offer entrepreneurs resources at every stage. Consulting networks, such as Business Talent Group, GLC Group and Patina Nation, can provide a stream of business as you get up to speed on marketing, or if you prefer not to go all-in on business development.

You may need to pivot. When Paul Dillon retired from the McGladrey accounting firm in 2006, he intended to provide project management and business development services. He ended up in a completely different niche: supporting veterans as they started their own businesses. He’s grateful for the journey.

“Becoming a consultant on your own, a sole proprietorship, in your field of endeavor, offers a chance to set your own hours, and work at your own pace, while making an income,” says Dillon, 74, who now lives in Durham, N.C., while maintaining a Chicago office.

Create a Plan

Writing a business plan may seem intimidating. Explore free models online, or just write down your business idea in your own words. It should encompass a few simple elements.

  • Your value proposition, which is the answer to four main questions: “What is the demand you see? Who has the demand; who is buying those services? What is your solution? Why is your expertise better than anyone else’s?”
  • A description of your services and how they will be priced and delivered.
  • A sales and marketing plan that includes your target market. Don’t assume social media will be a component. If you’re in cybersecurity, less public visibility may be the norm.
  • An operations plan, whether you will work alone or hire a staff.
  • Your goals for revenue and expected costs, as well as sources of capital, whether a loan or personal savings.
Week On Wall Street - Dazed And Confused
Solution and Strategy Path


"Psychology is probably the most important factor in the market, and one that is least understood." - David Dreman

Global Macro Scene

There has been plenty of commentary about the war in Ukraine and its impact on the global scene. There have been many instances of supply disruption due to prior Russian/Ukraine wars, and not one has put the world on the edge of a global recession. While there is evidence that this war is more intense, the facts would not support the commentary on the global impact that has been associated with this event.

Russia provides roughly 10 percent of the global supply of oil, and we have already seen evidence from the IEA that their production is at or above pre-war levels. There is the same amount of oil on the global market as there was before this new war. Ukraine supplies about 10% of the world's grain and as of today, that supply has been cut by ~60%. Russia supplies about 17% of the world's wheat and their exports have NOT been cut. They expect to export more grain than in '21.

Dmitry Rylko, the head of IKAR:

"Russia may export 39 million tons of wheat in the 2022/23 season, which starts on July 1. In the current season, IKAR expects the exports at 32.0-32.5 million tons."

The country's 2022 wheat crop is expected to reach 85 million tons, in what he called a "conservative" estimate. He previously expected a harvest of 83.5 million tons, up from 76.0 million tons in 2021.

Simple math tells us the grain supply has been minimally affected, if at all, and the amount of oil on the market has not been affected by the war. This is being confirmed by crude oil moving from $130 to $95, while wheat futures are off almost 40% from their highs, including a 27% decline last month. The reality of the situation has set in as these prices, along with others, have lost all of the "war" premium that traders built into the equation. Is there anyone out there who wants to tell me demand has slowed for food? This is about supply, and the number of commodities listed as in short supply in this month's Services PMI report was the lowest since February 2020. Supply hasn't been affected to the degree that is being suggested. It's a false narrative that has no credibility with anyone dealing with the facts.
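
To make that "simple math" concrete, here is a back-of-the-envelope sketch using only the rough shares quoted above. The function and figures are illustrative approximations, not a precise supply model:

```python
def wheat_supply_change(ukraine_share, ukraine_cut, russia_share, russia_growth):
    """Approximate the net change in global wheat supply, as a fraction of
    the pre-war total, from each exporter's world share and the change in
    its own exports."""
    return -ukraine_share * ukraine_cut + russia_share * russia_growth

# Figures quoted above: Ukraine ~10% of world grain with exports cut ~60%;
# Russia ~17% of world wheat, exports expected to rise from ~32.25Mt to 39Mt.
russia_growth = (39.0 - 32.25) / 32.25  # roughly +21%
change = wheat_supply_change(0.10, 0.60, 0.17, russia_growth)
print(f"Net change in global wheat supply: {change:+.1%}")  # about -2.4%
```

Even with Ukraine's exports cut by 60%, the offset from higher Russian exports leaves the net hit to world supply in the low single digits, which is the point the price action in wheat futures appears to confirm.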

The problem we face with inflation today is not due to this war, it's due to global policies that abandoned fossil fuels and added trillions of stimulus to economies that were operating at or above pre-pandemic levels. Fed Chair Powell testified to that in recent testimony before Congress.

Having said that, while an end to the war is a must from a humanitarian perspective, it will not by itself alleviate the perceived shortages or inflation issue. That leaves us with the notion that inflation is embedded in the economy and while it may have peaked it could stay elevated for a lot longer than some want to believe.

The Fed strategy MUST remain tied to inflation. The talk of "backing off" if the economy starts to weaken is premature conjecture. I'm not contemplating a Fed "pivot" that some say is in the cards. The stock market is now totally on board with the path of rates and is starting to factor in recession to some extent. On that score, there is a growing and almost uniform belief that the U.S. is headed into a recession. There is also a growing belief that negative corporate preannouncements will start during the 3rd quarter.

I am in the recession camp, but I differ in the timing and the depth/duration of the next recession as consensus now is calling for a quick, shallow dip in economic activity. It's very early in this analysis BUT at this moment in time, I see many factors that can come along to change ANY recessionary outlook from quick and shallow to extended and deep. Therefore it's best not to put TOO much emphasis on one or the other, today. Rest assured slower growth should be part of any near-term outlook. Just how "slow" is another matter.

The Consumer

The consumer is not confused; they are dazed and upset.

Consumer spending accounts for ~70% of GDP, so it’s no wonder the market will focus intently on consumer spending patterns. The ability of consumers to spend is the primary reason for the BULL argument of no recession in '22. That is supported by strong job growth, wage gains, and abundant savings. And strong activity during the summer—air traffic, hotels, restaurants, and resorts—support the view that consumers are willing to spend on services.

However, the record lows in sentiment, which are now unprecedented, do throw some cold water on the resilient consumer theory and balance the scales to give us the other viewpoint. Along with that poor sentiment come their views on future spending.

[Chart: Consumer spending plans]

The average consumer also sees the employment picture weakening ahead.

[Chart: Consumer employment outlook]

And this summarizes all of the negative sentiment in one chart.

[Chart: Overall consumer sentiment]


The most confusing issue these days is whether the economy is in a recession now or will be later in '22 or '23. However, this isn't something I am obsessed over. With a negative print for Q1 already in the books, if Q2 comes in negative and confirms a recession, that news isn't going to change my strategy. Remember, it's the price action of the markets that will continue to determine how to navigate. This BEAR market trend that has developed has already signaled trouble ahead. What the market does from here will signal how good or bad the situation gets.

We could very well be entering a sort of limbo backdrop where it may be too late to aggressively SELL, and too early to aggressively BUY. It doesn't get more confusing than that. From a MACRO perspective, this is how the investment scene will play out in a slower growth/recessionary backdrop: the stock market will bottom first (I'm not saying we are there yet), corporate earnings will bottom next, and finally the economy will bottom.

So at some point, while the news is still negative and the economy is at lows, stocks will turn. However, how long that journey will take is up for debate.

The Week On Wall Street

Market participants did a lot more than partaking in the festivities surrounding the long weekend. They pondered all of the issues that face the global economy and started the short trading week in a sour mood. The selling started in Europe and quickly moved here. All indices were lower seconds after the opening bell rang on Tuesday. The indices staged a mini-rally led by the NASDAQ (+1.5%) that left the S&P flat on the day.

The confusing week continued as Wednesday started as if the "BULL" was back only to see most of the gains lost when the closing bell rang. It sure looked like resistance levels were going to dominate again and turn the mini-rally around, but more buyers showed up on Thursday. The Chinese markets posted gains overnight, stocks rallied in Europe before the U.S opening bell and that spilled over to the U.S. equity market.

Thursday turned out to be the fourth straight day of gains for the S&P 500. That may not sound all that impressive (it isn't), but in a year like 2022, it is enough to be tied for the longest winning streak of the year. Earlier this year in Q1, there were three other streaks of similar duration, but in Q2, the best the S&P 500 could do was three days.

The four-day win streak for the S&P ended on Friday as the index closed with a modest 3-point loss, but it was able to post the third positive week investors have seen in the last 13 weeks. It has been one tough year for the BULLS. The NASDAQ continued to "outperform" during this mini rally as it did produce a 5-day winning streak with a slight gain on Friday.
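
Streak statistics like the ones above are simple to compute from a series of daily closes; a minimal sketch follows (the sample closes are made up for illustration, not actual S&P 500 data):

```python
def longest_win_streak(closes):
    """Return the length of the longest run of consecutive up days
    in a chronological series of closing prices."""
    best = run = 0
    for prev, curr in zip(closes, closes[1:]):
        run = run + 1 if curr > prev else 0
        best = max(best, run)
    return best

# Hypothetical daily closes: two up days, a down day, then four up days.
sample = [100, 101, 102, 99, 100, 103, 104, 105]
print(longest_win_streak(sample))  # 4
```

The same loop, run over each quarter's closes separately, reproduces the comparison between the Q1 and Q2 streaks mentioned above.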

The Fed & Inflation

Despite the incessant talk about interest rates, the Fed has still only hiked rates by 150-175 basis points. In my view that isn't enough to slow demand, as any FED action takes months to filter through the economy. Yet we have seen signs of demand destruction, and that is more of a sign that corporations and consumers are scaling back due to the myriad of issues present.

While economic data has slowed down, if not outright contracted, the Federal Reserve remains dead set on getting interest rates higher to combat inflation.

Persistent above-average inflation isn't positive, but runaway inflation is a killer. The recent data on inflation does show stabilization, and this is good news in the sense it probably indicates we won't experience runaway inflation.

Minutes from the last Federal Reserve meeting:

"In their consideration of the appropriate stance of monetary policy, participants concurred that the labor market was very tight, inflation was well above the Committee's 2 percent inflation objective, and the near-term inflation outlook had deteriorated since the time of the May meeting. Against this backdrop, almost all participants agreed that it was appropriate to raise the target range for the federal funds rate 75 basis points at this meeting. One participant favored a 50 basis point increase in the target range at this meeting instead of 75 basis points."

"All participants judged that it was appropriate to continue the process of reducing the size of the Federal Reserve's balance sheet, as described in the Plans for Reducing the Size of the Federal Reserve's Balance Sheet that the Committee issued in May... In discussing potential policy actions at upcoming meetings, participants continued to anticipate that ongoing increases in the target range for the federal funds rate would be appropriate to achieve the Committee's objectives."

"In particular, participants judged that an increase of 50 or 75 basis points would likely be appropriate at the next meeting. Participants concurred that the economic outlook warranted moving to a restrictive stance of policy, and they recognized the possibility that an even more restrictive stance could be appropriate if elevated inflation pressures were to persist."

These minutes reflect "the way it is" and in my view support my feeling that this talk of a Fed pivot is conjecture. Until inflation starts signaling it has subsided and will continue on a path lower, the Fed will continue its path to a more restrictive policy.

The Economy

GDP is backward-looking, and we have yet to see any of the hard economic data on activity in June, but if the regional Fed manufacturing reports that comprise the Five Fed Manufacturing Index are any indication, the results won't look pretty. The Atlanta GDPNow model, which looks at that data and much more, is now forecasting negative growth of -1.2% in Q2. Q2 GDP is now tracking close to what we experienced in Q1 (-1.7%).

When I saw all of the early economic forecasts at the end of last year that were suggesting growth to be steady in the first half of the year, I believed they were too high and said so, BUT I have to admit I didn't see this much weakness. We'll soon see how Q2 shapes up and the forecasters are already out talking about a 2nd half pick-up.

We should be wondering: where is the growth going to come from? There are no pro-business initiatives on the table. What is on the table is another proposal for tax increases and more anti-business rhetoric, creating more uncertainty. Unless inflation subsides quickly and dramatically (a low probability), corporate America is going to go into a shell, reduce hiring plans, and in some cases initiate layoffs. The consumer is telling a story that they aren't thrilled with the direction of the economy. What plans are in the making for them to flip and turn positive? The polling and data reports tell the story. The sour mood can't be "spun away" by remarks to "deal with it."

If it hasn't started already, it will seep into the economy and the markets. That keeps the slower growth model in place longer. Inflation is worse than a recession in that it affects everyone.


Some good news on the manufacturing front. U.S. factory orders surged 1.6% in May, double expectations, after increases of 0.7% in April and 1.8% in March. This is an 8th consecutive monthly gain, with five of the months climbing better than 1%.

However, the ISM services index slid another 0.6 ticks to 55.3 in June after falling 1.2 points to 55.9 in May, reflecting a continued slowing in activity. It is a third straight monthly decline after rising to 58.3 in March. And it is the lowest since 45.2 from May 2020. The index was at 60.7 last June. The components were mixed.

The final US Services PMI Business Activity Index registered 52.7 in June. The index dropped for the third month in a row and was down from 53.4 in May, pointing to the weakest rise in activity since January. New orders declined for the first time since July 2020.

Employment Scene

Nonfarm payrolls rose 372k in June. This follows prior gains of 384k in May and 368k in April. The jobless rate was steady at 3.6% for a fourth straight month. Average hourly earnings were up 0.3% from 0.4%, leaving the 12-month pace slipping to 5.1% y/y versus 5.3% y/y. The labor force participation rate dipped to 62.2% versus 62.3%.

Job openings fell 427k to 11,254k in May after sliding 174k to 11,681k in April. Hirings slid 38k to 6,489k following the 118k drop to 6,527k. The hire rate was unchanged at 4.3%. The quit rate slipped to 2.8% from 2.9% but is just off the 3.0% record high from November.

Initial jobless claims remain historically healthy in the low 200K range, but the most recent week's data did mark one of the highest readings of the year. Coming off of last week's unrevised 231K, claims rose 4K to the highest level since the second week of the year, when they clocked in at 240K. That remains a much better reading than what was observed throughout much of the history of the data, but it is at the higher end of pre-pandemic readings (those from roughly 2017 through 2019).

The jobs picture remains strong, and that adds more confusion to what has become a very complex economic situation. While there is plenty of data to support a slowdown morphing into recession, the jobs picture is not confirming it.

The Global Scene

Recession fears were front and center this week, especially in Europe as concerns of natural gas shortages heading into the winter months put the likelihood of recession as near-certain with the only question being how long and how deep.

This chart is all one needs to see to determine what is going on in the global manufacturing economy.

[Chart: Global manufacturing PMIs]

Only one country is improving - China. If these trends continue, especially in the U.S., China will find itself at the top very soon.

Chinese services companies registered a steep increase in business activity during June, according to the latest PMI data. At 54.5 in June, the seasonally adjusted headline Business Activity Index rebounded sharply from 41.4 in May to signal a renewed and marked rise in activity across China's service sector.

[Chart: China Services PMI (Caixin, S&P Global)]

While China continues to rebound, slowing growth is the theme in the EU.

Eurozone growth slows to a 16-month low in June. The final Eurozone Composite Output Index comes in at 52.0 (May: 54.8). While the final Eurozone Services Business Activity Index is at 53.0 (May: 56.1) which is a 5-month low.

The U.K.

The headline seasonally adjusted S&P Global / CIPS UK Services PMI Business Activity Index registered 54.3 in June, up slightly from 53.4 in May and above the 50.0 no-change mark for the sixteenth month running. However, the average reading in the second quarter of 2022 (55.6) was well below that seen in the first three months of the year (59.1). New order growth dropped to 16-month lows in June as economic uncertainty and rising inflation hit discretionary spending.

JAPAN and INDIA are following China's road to recovery.


The seasonally adjusted Japan Services Business Activity Index rose from 52.6 in May to 54.0 in June, indicating a solid rise in activity that was the quickest since October 2013.

Rising from 58.9 in May to 59.2 in June, the India Services PMI Business Activity Index was at its highest mark since April 2011 and signaled a steep rate of increase. Moreover, the acceleration in growth was broad-based across the four monitored sub-sectors. According to panelists, the upturn stemmed from ongoing improvements in demand following the retreat of pandemic restrictions, capacity expansion, and a favorable economic environment.

Political Scene

From a macro risk perspective, Taiwan is viewed as a core aspect of the 'wall of worry' which has built up over the potential for significant economic disruptions during a period of heightened uncertainty, given continuing tensions between the US and China over the fate of the island. Geopolitical risk will continue to be a driver of uncertainty for markets, especially as Europe's security architecture continues to transform in the wake of Russia's invasion of Ukraine. Sweden and Finland's now-agreed NATO accession will pose new challenges as the integration is formalized and Russia's reaction/any retaliatory steps become clearer.

In Taiwan, a risk of direct military confrontation between the US and China is of rising concern (especially in 2023 and beyond) but should be viewed as a lower likelihood. While many analysts' base case remains that tensions over Taiwan will remain managed and controlled, the tail risk of expanded conflict becomes a more significant factor in 2023 and beyond, especially if US foreign policy becomes more assertive toward Taiwan.


Sentiment

While the last 3 weeks have seen the S&P 500 climb back from its lows, sentiment shows that investors are not buying into the positive price action. In this week's update of the AAII sentiment survey, there was an overall push toward more bearish tones.

The percentage of respondents reporting as bullish fell back below 20%. Even though that is not any sort of new low, this week is the fifth in a row with less than a quarter of respondents reporting as bullish. Such an extended stretch of depressed sentiment is rare; the last example was in May of 1993.
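
Runs like that are easy to measure; a short sketch counts how many consecutive recent weekly readings sit below a threshold (the sample readings are hypothetical, not actual AAII survey data):

```python
def current_streak_below(readings, threshold):
    """Count consecutive readings below the threshold, starting from the
    most recent and stopping at the first reading at or above it."""
    streak = 0
    for value in reversed(readings):
        if value < threshold:
            streak += 1
        else:
            break
    return streak

# Hypothetical weekly AAII bullish percentages, oldest first.
weekly_bullish = [28.0, 26.5, 23.1, 19.4, 22.8, 24.0, 18.9]
print(current_streak_below(weekly_bullish, 25.0))  # 5
```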


Earnings

Moving into Q2 earnings season with the economy on the verge of, or already in, a recession, there's understandably a lot of concern over how results will come in relative to expectations, and then how stock prices will react to those results. There's a general view that estimates are too high (my feeling as well) and companies are poised to come up short when they report. We'll only know the answer as the reports come in.

We are seeing plenty of downward revisions to EPS estimates, and whether the downward revisions will set the bar low enough remains to be seen. Earnings season hasn't officially started but some reports have hit the wire. It’s admittedly a very small sample size, but of the results we have seen already, companies haven’t had any trouble meeting analyst expectations. Guidance is another story, though.

Food For Thought

A recent Supreme Court ruling has tossed uncertainty into the entire "green energy" carbon emission and ESG mandate regime. Regulatory burdens that have been imposed by agencies led by non-elected officials are now being challenged.

The court's recent decision states that federal agencies have assumed too much regulatory power and should be reined in. Instead of passing laws on matters such as carbon regulation and energy policy that have national security implications, the executive branch (administration) has decided federal agencies should dictate policy. No more.

"The federal bureaucracy is no longer allowed to impose programs of major 'economic and political significance on the country absent 'clear congressional authorization'"

In translation, the mandates proposed by the EPA are overreaching and violate the Constitution. The ruling states that Congress must act and laws must be passed instead of an agency dictating policy. In doing so, these "agencies" have circumvented the representatives who act on behalf of the people.

The same will hold for the SEC, which has decided to pass regulations requiring corporations to comply with its rules regarding the environment and the infamous ESG mandate. These "rules" can now be challenged as "unconstitutional," with the Supreme Court ruling as a tailwind.

I doubt this will have a major impact on the transition to a clean environment, and that wasn't the intent of the court's ruling. However, we could see companies start to balk and object to many of the regulatory burdens involved with the ESG movement, and the cumbersome regulations imposed by the EPA. One of the first challenges will come if the latest EPA ruling on ozone levels for the Permian Basin in Texas is allowed to become law; it has the potential to curb drilling and, if nothing else, make operating there more expensive. For those who may not know, the Permian Basin is one of the richest land masses in the world for the development of oil and natural gas.

The court's ruling is all about taking unchecked power from a single entity and bringing it to the representatives of the people, Congress. Under the Constitution, they and they alone have the power to make laws that will have a major impact on society, the economy, and national defense. One way or another, this war on fossil fuels has to come to an end. It is another reason I have my doubts about a quick, shallow recession.

States and corporations will have the law and the constitution on their side now, and corporations have a big incentive to do so. It is estimated that a large corporation will spend a half-million dollars on filing the paperwork that is required to disclose its ESG status and initiatives. That is something they might want to debate now.

This Supreme Court ruling is a game-changer that can end these unconstitutional regulations.

In the long run, that is a positive for the economy and the markets, BUT in the interim, it could be a long and hard tug of war in the courts. Only a change in policy that effectively neuters these agencies and dismantles the unconstitutional regulations will bring this to an end.

More Stimulus?

The lack of economic understanding isn’t confined to the Federal level. California last week doubled down on the stimulative policies (other states are also contemplating this policy) that heavily contributed to the mess we’re in by announcing the largest direct payments from a state ever in the form of inflation relief stimulus.

It has been decided the best way to extinguish a fire is to pour gasoline on it. Any stimulus is only going to increase energy demand while doing nothing to create the additional supply necessary to significantly bring down energy prices in the long term that have become such a burden on consumers. Without a realistic understanding of the forces that produce inflation, the only “solution” we’re likely going to get is an eventual recession bad enough to crush demand and bring everything down with it. That seems to have already begun.

Then again, we can't forget the political angles played in an election year. That extra pocket money will find its way into the economy and is sure to keep inflation at higher levels, and of course, the checks will be delivered to consumers just in time for the mid-term elections.

A new "GREEN" Tax

The RUSH to go green starts with the EV revolution. So California has decided to help get that industry off the ground by taxing it (sarcasm intended). California will now tax Lithium, the metal that is essential to an EV battery.

However, there is no discrimination when it comes to taxes. Californians contending with the highest gas prices in the nation are now paying another 3 cents per gallon due to the state’s annual gas tax increase that took effect July 1st. Only Pennsylvania has a higher excise tax on gasoline. I'd like to believe they are trying to disincentivize the purchase of gasoline, but in California, one can't simply get from point A to point B without a vehicle. There are about 700K EVs on the road there compared to the 17.5 million combustion-engine vehicles registered.

The crowd that simply can't get anything right continues to make policy error after policy error as the demonization of the corporate world and anti-business backdrop remains an integral part of the policy. Lowering the "boom" on Corporate America isn't going to help the consumer situation. After all, it is Corporate America that allows everyone the opportunity to Excellerate their status. This antagonistic mindset has a tremendous effect on the economy and eventually the stock market, and when the economy is slowing, it is a HUGE negative.

Analysts that are calling for a quick/shallow recession had better start to factor "policy" into the equation, because that is a wild card that can drag this slow-growth period on for a lot longer.

The Daily chart of the S&P 500 (SPY)

Short-term resistance was overcome this week, but there are plenty of other hurdles for the S&P and other indices to overcome. I noticed the tone of the equity market did change slightly this week.


S&P 500 daily chart, July 8th

Compared to what we have witnessed during this steep decline, anytime the indices dipped, buyers stepped in. It is a small first step but the BULLS have to start somewhere. The downtrend is still very evident, and all of the technical damage that has been done will have to be repaired. That has the look of a long-drawn-out affair.

Investment Backdrop

The Energy sector got its turn in the woodshed, and that leaves no sector unscathed this year. Despite the bashing, the Energy sector remains in a longer-term BULL trend and, other than Healthcare, it is the only sector that can say that. The Commodity sub-sector ETFs (those with heavy oil exposure) also took it on the chin and have now dropped out of a bullish trend. The Agriculture sub-sector fell apart last month, and it too is no longer in an uptrend that can be relied on.

Since commodities in general and agriculture have broken down totally, I'm hard-pressed to say this is a reversion to the mean type of situation. Only a quick reversal and a retake of the trend line would change my view. Perhaps the markets are signaling inflation has peaked and will subside from these levels. Then again commodities tend to get whipsawed by emotion, so anything can occur.

From a broader market perspective, the drop in Energy could also very well support the idea that a major bottom in stocks is trying to form. As the last holdout of the selloff in equities, Energy could be the last domino to fall before they are set back up again. In addition, this great rate pullback has now dropped the 10-year Treasury yield to 2.78% from its high of 3.49% on June 14th. That headwind has now become a tailwind. That has spurred the turnaround situations in the speculative areas of the market, ARKK and Biotech (XBI). Both have followed through nicely since they were highlighted here in late May. Those were the areas that topped the earliest and led on the way down earlier in the bear market, but now they are outperforming.

In a recent update to members and clients I simplified the market into three different categories to take advantage of the mini-rally this past week, all based on primary trends and risk/reward setups.

Thank you for reading this analysis. If you enjoyed this article so far, this next section provides a quick taste of what members of my marketplace service receive in DAILY updates. If you find these weekly articles useful, you may want to join a community of SAVVY Investors that have discovered "how the market works".

The 2022 Playbook is now "Lean and Mean"

Yes, that is correct, opportunities are condensed in Energy, Select Commodities, Healthcare, Biotech, and China. The message to clients and members of my service has been the same. Stay with what is working.

Each week I revisit the "canary message" which served as a warning for the economy. The focus was on the Financials, Transports, Semiconductors, and Small Caps. I used them as a "tell" for what direction the economy was headed to help forge a near-term strategy. They sent their messages for the economy and since that day the S&P is down 15%. There will be times when the canaries appear to be revived, but, until there is a decided swing in the technical picture where rallies take out resistance levels, they continue to warn about the near-term outlook.

Fixed Income

Across the Treasury yield curve from the 2-year point on out, yields have broken their uptrends that have been in place since Q1. However, they are at support levels. Whether this was the result of quarter-end rebalancing or a true flight to safety trade remains to be seen.

10-Year Treasury

10-year Treasury yield chart

The 2-10 spread is now inverted, as the 2-year Treasury yields 3.12%, which is higher than the 10-year. While not an immediate signal of recession, the longer the curve stays inverted the higher the chance for recession.
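To make the arithmetic behind that curve call explicit, here is a minimal sketch. The yields are the ones quoted in this update; a negative spread means the curve is inverted:

```python
def two_ten_spread(ten_year_yield, two_year_yield):
    """2s10s spread in percentage points; negative means the curve is inverted."""
    return ten_year_yield - two_year_yield

# Yields cited in this update: 10-year at 2.78%, 2-year at 3.12%.
spread = two_ten_spread(2.78, 3.12)
print(f"2s10s spread: {spread:+.2f} pts ({'inverted' if spread < 0 else 'normal'})")
```

The sign, not the magnitude, is what recession watchers track; the duration of the inversion matters more than its depth.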

What are the Credit Markets Telling Us?

Credit spreads are often a good barometer of economic health - widening (BAD) and narrowing (GOOD).

Last week I noted that spreads were challenging the March and May highs. This week they broke above those peaks.


Credit spreads chart

Unless we get a quick reversal, that signals more market "stress" ahead.

Small Caps

The Russell 2000 has already made a round trip back to the 2020 breakout level. So far that support has held. The unknown is whether the other major indices will follow this same path; if that is the case, there is a lot more downside ahead for them. In the meantime, the small caps appear to be tracing out a sideways pattern, signaling their BEAR market correction is bottoming. However, if the small caps break the support here, it's another dire warning about this economy, and more than likely the other indices will set new lows as well.


Energy
A while back I mentioned there were only two ways to lower the price of crude oil: increase supply or go into a global recession. Since we have seen no effort made to increase supply, the path that was chosen is a global slowdown that could morph into a recession that eventually kills demand. The market has figured that out as well, as the price of crude has dropped significantly in the last 5 weeks.

It's been my premise that with the EU mired in a major energy crisis there is little hope they can avoid recession, and the markets are concerned they will be the first domino to fall, and then we'll have to see if this contagion starts to spread. The premise now is oil demand will fall as the entire globe is in recession. China is a wildcard in that its economy has turned and could balance some of what appears to be demand destruction caused by a recession.

In the interim Energy remains in a BULL market trend. The quick drop has been unsettling, but with no change in policy, I'm not ready to pull the plug on the sector just yet.

The Interior Department issued its proposals for a drastic reduction in offshore oil and gas lease sales between 2023-2028. The proposed five-year plan removes the Atlantic and Pacific coasts from consideration and also floats the possibility of scrapping all offshore sales for the period.

So the war on fossil fuels continues unabated, and that should keep a floor under crude oil prices. The Energy ETF (XLE) is now at long-term support and if these levels do not hold, I will be forced to re-evaluate the sector. I've concentrated on stocks that pay above-average dividends and that has helped me navigate this period. These companies will be able to maintain 7-8% yields with WTI at $85-$90. WTI closed the week at $103.

Last weekend I added a special section to my weekend updates at the Savvy Investor to cover the opportunities present in Liquefied Natural Gas.

Financials
The Financial Sector ETF (XLF) shows a rounding TOP pattern, but has now put in a series of higher lows since mid-June and the ETF also was able to move above its first resistance level. A turnaround here would go a long way in keeping any rebound rally in the market alive.


Mortgages are the Most Expensive Since the Real Estate Bust

After about 40 years of mostly declining mortgage rates, the trend looks to have reversed. The 30-Year Fixed Rate Mortgage Average in the U.S. is now at its highest point since the Financial Crisis/Real Estate Bubble bust, at 5.81%.


Mortgage rates chart

A year ago, a 30-Year mortgage was still under 3%. In aggregate, it's a big difference. It's a prime reason the housing sector and home builders are under pressure. One positive for the group: lumber prices have come down substantially in the last few months.
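To put numbers on that "big difference," here is a rough sketch using the standard amortizing-loan payment formula. The $400,000 loan size is an assumption for illustration; the two rates come from the article:

```python
def monthly_payment(principal, annual_rate, years):
    """Fixed monthly payment on an amortizing loan: P*r / (1 - (1+r)^-n)."""
    r = annual_rate / 12          # monthly rate
    n = years * 12                # number of payments
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical $400,000 loan, 30-year term.
then = monthly_payment(400_000, 0.03, 30)    # a year ago, at 3%: ~$1,686/mo
now = monthly_payment(400_000, 0.0581, 30)   # today's 5.81%: ~$2,350/mo
print(f"Extra cost at today's rate: ${now - then:,.0f}/month")
```

On those assumptions the same house costs roughly $663 more per month, close to $240,000 in additional payments over the life of the loan, which is why affordability, not just price, is squeezing the housing sector.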

I will note that the Homebuilder ETF (XHB) has outperformed the S&P in the last 3 months, but with all of the major homebuilders in BEARISH trends, I wouldn't be inclined to bottom fish in this pond. For anyone who is so inclined, Toll Brothers (TOL) sells to the higher-end consumer that is less affected by all of the negatives in this economy. The chart shows a potential rounding bottom pattern, signaling a possible end to its long downtrend.

Healthcare
S&P 500 Healthcare’s P/E is 15.8x, lower than the general market's, meaning there could be more room for this group to continue to outperform. The Healthcare ETF is back in a Long Term BULLISH mode after the recent three-week rally.

For now, we will have to assume that the prior move lower was a false breakdown and the rally left the group back in more of a sideways pattern. In this market sideways is a WIN. Look for the individual stocks that possess a good LT BULLISH trendline.

Biotech
A successful test of the lows has led to a mini-rally that suggests this bottoming process is in place, and so far looks good. The XBI appears to be setting up for a BEARISH to BULLISH reversal. This was further confirmed when the ETF moved above formidable resistance levels. I continue to believe there will probably be a lot of give-and-take action. If XBI can now stay above the former resistance level, it will turn into support. This trade has turned into a 30% gain for Savvy Investors since late May. Harvesting profits on the long side in a BEAR market goes a long way in keeping a portfolio above water.

Gold
The metal settled recently at its lowest level since last September. I kept repeating my view: if gold couldn't rally with inflation at 40-year highs, what would it take to move the metal higher? That view has been correct. Gold has been an incredible disappointment over the past couple of years, and the miners have arguably been more so. While they have had some decent periods, ultimately the popular “inflation hedge” has just cost investors a lot of money.

I continue to avoid the commodity.

Technology
The majority of technology remains in a downtrend and will have a lot of work to do to repair the technical damage that occurred in the last few months.

Despite the Technology ETF (XLK) being in a BEAR trend, there are some hidden gems in the sector. IBM is one such diamond in the rough; it's a stock that is hated by some. However, when I see a stock in a BULLISH configuration, yielding 4.7% while the general market is mired in a BEAR market, I am inclined to keep it in my portfolio.

Semiconductors
I have often referred to the semiconductor sector as a leading indicator for the broader economy and market. It is one of the canaries that I constantly watch. Unfortunately for bulls, June was a horrible month for the sector. The iShares Semiconductor ETF (SOXX), which tracks the Philadelphia Semiconductor Index, dropped over 25% for its weakest quarter since Q4 2008. It was also the first time since Q4 2008 that SOXX dropped by double-digit percentages in back-to-back quarters.

Since its peak late last year, SOXX has been trading in a steady downtrend channel but has been spending a lot more time at the lower end than the upper end of the channel. The ETF recently broke another support level.

If we do see a rebound in the coming days back to the upper end of the channel, it may be a signal to re-evaluate and add the inverse ETF position (SSG).

Anyone bottom fishing in this group should have tight mental stops in place. I will need to see strong evidence that this group has bottomed before I change my outlook. I don't know where this downtrend ends but despite this mini-rally, it has the look of going a lot further. When we consider the fact that the semis have been stellar outperformers for a very long time, the reversion back to the mean can often be very extensive. I remain cautious.

ARK Innovation ETF (ARKK)

The Bearish to Bullish reversal pattern that I highlighted in May is still in play. There is now mounting evidence that a bottom in this ETF may have been established. A series of higher lows after the ETF retested the May lows gave me the confidence to advise aggressive investors to get involved. The risk/reward at the time was compelling. The risk didn't materialize, but the reward, to the tune of 28%, has. Another area where gains are being harvested.

ARKK represents the more speculative areas that topped out first last year and have demonstrated relative strength lately. Perhaps this is a sign that the overall market may be in the same process now and a bottom will soon be established in the indices.

International Markets


One of the reasons I believed China was in a turnaround situation revolved around some of the recent commentary coming from President Xi.

“We will step up macroeconomic policy adjustment, and adopt more forceful measures to deliver the economic and social development goals for the whole year and minimize the impact of COVID-19,”

Unlike all other developed countries, China has plenty of tools in its toolbox to stimulate an economy, and it has no worries about high inflation. Its backdrop is changing to a positive outlook, while the US and other global markets are in the midst of a change to a negative one.

So while China is in recovery mode and the US economy remains a question mark, the administration is contemplating lending a helping hand to China in the form of easing tariffs on Chinese goods.

It appears the Chinese CSI 300 ETF (ASHR), along with other Chinese stocks have put in a bottom, then tested those lows, and have now rallied hard in the face of a changing (positive) fundamental picture.

I continue to believe the Chinese markets are in the process of tracing out a BEARISH to BULLISH reversal pattern. ASHR, FXI, BABA, JD, KWEB and others are decent risk/reward setups now, especially on dips.

Bitcoin
Not only has it been a historically bad first half for equities, but with June and Q2 now in the rearview, Bitcoin posted the worst month and quarter on record going back to at least 2014. In June, Bitcoin declined by 37.8% after two other monthly declines of over 15% in April and May. That drop surpassed the prior record decline of 36.4% in November 2018 and the runner-up, a 35.4% drop in May of last year.

Bespoke Investment Group:

Although investors may have the urge to buy the dip, historical performance following monthly declines of 15%+ has not been positive in the near term. The world’s largest crypto has averaged a decline in the next week and month, whereas all other months have been more consistently positive. Again, every month of Q2 saw Bitcoin fall at least 15% from month end to month end, and unsurprisingly as a result, that meant Q2 was the single worst quarter to date for the crypto.

From the end of March through the end of June, Bitcoin had fallen 56.6%. That places Q2 ahead of the Q1 2018 decline of 50.7%. In total there have now been seven quarters in which Bitcoin has fallen at least 15%, and only six of the fifteen aforementioned worst months on record fell outside of those quarters. As with the ends to the worst months, the worst quarters have typically seen further moves to the downside at the opening of the next quarter. Farther out over the next two quarters, performance remains mixed with the following quarter positive half the time and two quarters later it has been positive only a third of the time.
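Those quarterly figures follow directly from compounding the monthly drops. A small sketch: June's -37.8% is from the text above, while the April and May numbers are illustrative stand-ins consistent with "declines of over 15%":

```python
def compound(monthly_returns):
    """Chain fractional monthly returns into a single cumulative return."""
    total = 1.0
    for r in monthly_returns:
        total *= 1.0 + r
    return total - 1.0

# Three straight monthly declines of 15%+ compound brutally.
q2 = compound([-0.173, -0.159, -0.378])
print(f"Q2 cumulative return: {q2:.1%}")  # about -56.7%, in line with the -56.6% cited
```

The point of the exercise: losses compound multiplicatively, so three "mid-teens or worse" months wipe out more than half of the asset's value.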

Final Thought

During my investment career, I have been associated with people that have a Ph.D. in this, a Ph.D. in that, and those that have a high school diploma. The successful investor realizes the role that human emotion plays in managing money. It's not about the number of degrees, it is about common sense. Over and over, we talk about avoiding the noise. The reason is simple, once that is accomplished it gives you the chance of applying that common sense to the situation.

Our minds take us to a place where we believe in our own experiences. Unfortunately, they make up a tiny fraction of how the world and markets work. Think about it for a minute. Someone who started investing and has seen nothing but improving markets has a far different view from anyone that started investing in 2007. Each has a myopic view of the total picture, yet both believe their views ARE the way the markets will evolve. Of course, nothing could be further from reality.

Investors also get trapped when they refuse to have an open mind when investing. The cornerstone of economics is that things change over time because, as we have seen, the situation doesn't stay too good or too bad indefinitely. However, the last situation, like the end of a movie, stays with us for a long time. It's why so many people got caught up in the rhetoric. Many investors could not buy into the fact that the investment scene changed in February.

Listen To The Market

The stock market started sending a series of signals, but few listened. All one had to do was watch what was developing at the time, instead of reacting to whatever news was cluttering the financial airwaves. Instead, "The economy isn't that bad" and "there won't be a recession" dominated the biased 'politically correct' headlines that for some reason couldn't deal with the reality of the inflationary backdrop that started in 2021. Outright denial of the facts has led many to stay on board this market for far too long, and it's taken them over the cliff.

Facts versus fiction, conjecture, and speculation. Facts trump all, and the stock market's price action is FACT. Investors talk themselves into a lot of mistakes by allowing their emotions to rule their decisions. At the Savvy Investor Marketplace Service, that strategy isn't allowed. I like to navigate the market scene with common sense based on reality. Back in 2013, I stated right here on Seeking Alpha that the BULL market would be a life-changing event if one took the time to understand what was happening. That call took a lot of criticism at the time, but it has been correct.

Market participants are "Dazed and Confused" because of the crosscurrents that can serve up a myriad of possibilities for the markets. One of these is a BEAR market scene that has the potential to be a life-changing event as well. Understanding the nuances and specifics of how to navigate this scene will separate the winners from the losers.


Please allow me to take a moment and remind all of the readers of an important issue. I provide investment advice to clients and members of my marketplace service. Each week I strive to provide an investment backdrop that helps investors make their own decisions. In these types of forums, readers bring a host of situations and variables to the table when visiting these articles. Therefore it is impossible to pinpoint what may be right for each situation.

In different circumstances, I can determine each client’s situation/requirements and discuss issues with them when needed. That is impossible with readers of these articles. Therefore I will attempt to help form an opinion without crossing the line into specific advice. Please keep that in mind when forming your investment strategy.

THANKS to all of the readers that contribute to this forum to make these articles a better experience for everyone.

Best of Luck to Everyone!

Fri, 08 Jul 2022
Opinion: The quiet, compelling leadership of Shinzo Abe

The stunning, senseless death of former Japanese Prime Minister Shinzo Abe by an assassin’s bullet in Nara illuminates the vulnerability of political leaders. Abe’s legacy reminds us of the powerful influence of quiet diplomacy and how one might think usefully about political leadership.

Shinzo Abe and quiet diplomacy

Part of the measure of any administration’s success in foreign policy rests on the personal relationships established by leaders. Ronald Reagan and Margaret Thatcher, Bill Clinton and Tony Blair, and George H.W. Bush and Brian Mulroney are examples of the benefits that flow when leaders establish trust and admiration. Shinzo Abe, during his tenure as the longest serving Japanese Prime Minister in the last three quarters of a century, was as skilled as any foreign leader in establishing such relationships with George W. Bush, Barack Obama and Donald Trump.   

A major figure in Japan’s Liberal Democratic Party, Abe had a nonjudgmental personality and cheerful disposition that were key to his success. He was the first foreign leader to contact Donald Trump following his 2016 election, a courtesy that helped begin their relationship on a positive footing.

Matt Pottinger, the deputy national security adviser, confirms that Trump had more interactions with Abe than with any other foreign leader. They genuinely seemed to enjoy one another, including their shared love of golf. Their lengthy discussions ranged from coordinating the response to North Korea’s nuclear and missile tests to fashioning a bilateral U.S.-Japan trade agreement. Under that agreement, concluded in 2019, 90% of U.S. food and agricultural products imported into Japan are now duty free or receive preferential tariff access.

The Trump and Biden administrations have embraced Abe’s idea of a “free and open Indo-Pacific” and of a relationship of the Quad countries — the United States, Japan, India and Australia — designed to strengthen the rule of law and mutual security arrangements, among other objectives. Much of Abe’s impetus for the Quad was his concern over the threat of a more aggressive and authoritarian regime in China, which referred to the Quad as the Asian NATO.

Following the 2016 election, when the U.S. determined not to submit the recently negotiated Trans Pacific Partnership for congressional approval, Abe led the effort to conclude an agreement among the remaining 11 TPP countries, winning support for a Comprehensive and Progressive Agreement for Trans-Pacific Partnership that largely maintained the TPP agreement and left the door open should the U.S. choose to join.

What Shinzo Abe knew about ideas

In 2015 Abe came to Harvard as part of a U.S. visit that included the first time a Japanese Prime Minister had addressed a joint session of Congress. His remarks were not merely perfunctory but included a sense of how he viewed the rising generation and especially the role that women could and should play in its leadership. He came across as an idealist without illusions, answering questions forthrightly and with respect for those who asked them.

He included facts and statistics that demonstrated he understood the intricacies of a challenge and the range of viable options. At the same time, he neither denied nor dismissed those who advanced challenging questions. Not least, he punctuated his answers with a touch of humor, noting in response to a question about the role women can play that if Lehman Brothers had been Lehman Brothers and Sisters, they might still be in business.

For political leaders, ideas can matter in two important senses. 

First, they provide a framework and vision of what is important and of goals that are worth pursuing. The status quo is buttressed with a good deal of inertia. Societies benefit from having a measure of stability.  Such stability enables individuals and organizations to plan with confidence and can contribute to establishing trust and confidence in institutions and societies.  

At the same time, great individuals, organizations and societies embrace a measure of dynamism. They do not settle into a comfortable complacency. They are willing to make changes that inevitably involve some risk. Finding an appropriate balance between stability and change is one of the most important tasks of leaders.

Shinzo Abe had a fresh vision for Japan and its place in the world. He recognized the challenge presented by the current leadership in China. He saw the value of a coordinated response by nations that found authoritarianism counterproductive and that embraced individual liberty, free market economic arrangements, and that guaranteed the rule of law.  

Political leadership also involves sustained follow-through: moving the public, the permanent government bureaucracy and other officials at home and abroad. This often takes many years. Abe’s idea of a Quad involving Japan, the United States, India and Australia as a bulwark in support of democracy took more than a decade to take root and to be embraced by its members and acknowledged by China and others.

His patience never failed.

Shinzo Abe was skilled in putting together alliances that respected distinctive national interests and yet could find much in the way of common ground. His success went well beyond his charm and charisma.  It rested on determination, resilience and pleasant persistence.

Roger B. Porter, IBM Professor of Business and Government at Harvard University, served as the assistant to the president for economic and domestic policy from 1989-93.

Thu, 14 Jul 2022
IBM acquires Israeli startup Databand to boost data capabilities

US tech giant IBM said Wednesday that it acquired Israeli startup Databand, the developer of a data observability software platform for data scientists and engineers, to strengthen the multinational’s data, artificial intelligence, and automation offerings.

The terms of the acquisition were not disclosed. According to the agreement, Databand employees will join the IBM Data and AI division to further enhance IBM’s portfolio of data and AI products including its IBM Watson, a question-answering computer system, and IBM Cloud Pak for Data, a data analytics platform.

IBM said the acquisition was finalized in late June and that the purchase will build on IBM’s research and development investments, as well as strategic acquisitions in AI and automation. Databand is IBM’s fifth acquisition this year, the company noted.

Databand was founded in 2018 by Josh Benamram, Victor Shafran, and Evgeny Shulman, and rolled out a software platform that the company says helps enterprises and organizations get on top of their data to ensure “data health” and fix issues like errors and anomalies, pipeline failures, and general quality.
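To make "data health" concrete, here is a toy sketch of the kinds of checks such platforms automate (this is not Databand's actual API; the function, field layout and thresholds are invented purely for illustration):

```python
from datetime import datetime, timedelta

def check_health(rows, null_threshold=0.05, max_age=timedelta(hours=24)):
    """Return a list of data-health issues for a batch of (timestamp, value) rows.

    An empty list means the batch looks healthy. `value` may be None.
    """
    if not rows:
        return ["pipeline produced no rows"]
    issues = []
    null_rate = sum(1 for _, v in rows if v is None) / len(rows)
    if null_rate > null_threshold:
        issues.append(f"null rate {null_rate:.0%} exceeds {null_threshold:.0%}")
    newest = max(ts for ts, _ in rows)
    if datetime.utcnow() - newest > max_age:
        issues.append(f"data is stale: newest row is older than {max_age}")
    return issues
```

A real observability platform layers anomaly detection, lineage tracking and alerting on top of checks like these, but the principle is the same: catch bad data before it reaches downstream consumers.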

The data observability and data quality market is likely to see further growth, as more organizations look to closely track and protect their data. A Statista report estimated that the sector will grow from about $13 billion in worth in 2020 to almost $20 billion in 2024.

Based in Tel Aviv, Databand has raised about $20 million, according to the Start-Up Nation Finder database, with investors such as VCs Accel, Blumberg Capital, Ubiquity Ventures, Bessemer Venture Partners, Hyperwise, and F2 Ventures.

“By using Databand with IBM Observability by Instana APM [an application performance monitoring solution] and IBM Watson Studio, IBM is well-positioned to address the full spectrum of observability across IT operations,” IBM said in the announcement Wednesday.

“Our clients are data-driven enterprises who rely on high-quality, trustworthy data to power their mission-critical processes. When they don’t have access to the data they need in any given moment, their business can grind to a halt,” said Daniel Hernandez, general manager for IBM Data and AI, in a statement.

“With the addition of Databand, IBM offers the most comprehensive set of observability capabilities for IT across applications, data and machine learning, and is continuing to provide our clients and partners with the technology they need to deliver trustworthy data and AI at scale,” he explained.

Benamram, who serves as Databand CEO, said: “You can’t protect what you can’t see, and when the data platform is ineffective, everyone is impacted – including customers. That’s why global brands such as FanDuel, Agoda and Trax Retail already rely on Databand to remove bad data surprises by detecting and resolving them before they create costly business impacts.”

Joining IBM will help Databand “scale our software and significantly accelerate our ability to meet the evolving needs of enterprise clients,” he added.

Databand is one of a number of leading Israeli data observability companies including Coralogix, which raised a $142 million Series D funding round announced in May, and Monte Carlo, which secured a $135 million Series D round at a valuation of $1.6 billion, also in May.

Separately, IBM has been active in Israel for decades and runs an R&D center in Tel Aviv and a research lab in Haifa.

The Haifa team is the largest lab of IBM Research Division outside of the United States. Founded as a small scientific center in 1972, it grew into a lab that leads the development of innovative technological products and cognitive solutions for the IBM corporation. Its various projects utilize AI, cloud data services, blockchain, healthcare informatics, image and video analytics, and wearable solutions.

Wed, 06 Jul 2022
So, You Think You Can Design A 20 Exaflops Supercomputer?

UPDATED* Perhaps Janet Jackson should be the official spokesperson of the supercomputing industry.

The US Department of Energy has a single 2 exaflops system up and running – well, most of it anyway – and that of course is the “Frontier” system at Oak Ridge National Laboratory, with two more slated for delivery: the “Aurora” system at Argonne National Laboratory, supposedly coming sometime this year, and the “El Capitan” system at Lawrence Livermore National Laboratory, which is due next year. It took a lot of money and sweat to get these machines into the field – in Intel’s case, the sweat to money ratio has been pretty high given the four-year delay and massive architectural changes involved in the latest and final incarnation of Aurora.

This week, the DOE put out a request for information for advanced computing systems, with Oak Ridge riding point, to get the vendor community to submit its thoughts on the next generation of supercomputers it expects to install sometime between 2025 and 2030. The DOE expects these machines to deliver somewhere between 5X and 10X the performance on the real-world scientific problems it tackles today, or to have more oomph to take on more complex physical simulations or run them at higher resolution and fidelity. The RFI will give the DOE the base information from which it can begin evaluating the solution space for system development from 10 exaflops to 100 exaflops, figure out the kinds of options it has, and determine what research and development will be necessary to get at least two vendors building systems (if history is any guide).

The RFI is illustrative in many ways, and this particular paragraph sticks out:

“Include, if needed, high-level considerations of the balance between traditional HPC (FP64) needs and AI (BF16/FP16) needs; Include considerations, if needed, of architecture optimizations for large-scale AI training (100 trillion parameter models); domain specific architectures (e.g., for HPC+AI surrogates and hybrid classical–quantum deployments). Our rough estimate of targets includes traditional HPC (based upon past trends over the past 20 years) systems at the 10–20 FP64 exaflops level and beyond in the 2025+ timeframe and 100+ FP64 exaflops and beyond in the 2030+ timeframe through hardware and software acceleration mechanisms. This is roughly 8 times more than 2022 systems in 2026 and 64 times more in 2030. For AI applications, we would be interested in BF16/FP16 performance projections, based on current architectures, and would expect additional factors of 8 to 16 times or beyond the FP64 rates for lower precision.”

Elsewhere in the RFI, the DOE says the machines have to fit somewhere in a power envelope of between 20 megawatts and 60 megawatts – which probably means you design like crazy for 50 megawatts and it comes pretty close to 60 megawatts.

If you are cynical like that – and sometimes you have to be when facing down the slowing of Moore’s Law – then you can already get 2X performance with today’s technology just by scaling the power. Frontier weighs in at 29 megawatts when fully burdened, so the first 4 exaflops is easy: just double the size of the system and count on software engineers to figure out how to scale the code.

If you want to build a 10 exaflops to 20 exaflops system in the same power envelope and within the same $500 million budget of Frontier, which the RFI from the DOE suggests gently is a good idea, then you have a real engineering task ahead of you. And frankly, that should be the goal for advanced supercomputing systems – do a lot more with the same money and power. Enterprise IT has to do more with less all the time, while HPC has to try to rein in the power and budget. This may not be possible, of course, given the limits of semiconductor and system manufacturing.

The money is as big of a problem here as are Moore’s Law issues and coping with sheer scale, so let’s talk about money here for a second.

We had the frank discussion about the money behind exascale and beyond recently with Jeff Nichols, who is spending his last day today (June 30) as associate laboratory director for Computing and Computational Sciences at Oak Ridge, and we did some more cost projections for capability class machines when covering the “Cactus” and “Dogwood” systems that the National Oceanic and Atmospheric Administration in the United States just turned on this week for running models at the National Weather Service for weather forecasts. The 3X boost in performance that NOAA has is great, but as we pointed out, NOAA needs something more like a 3,300X increase in performance to move from 13 kilometer resolution forecasts down to 1 kilometer – and even more to get below that 1 kilometer threshold where you can actually simulate individual clouds. And that would be a serious engineering challenge – and something within the scope of the RFI that Oak Ridge just put out, by the way. Probably somewhere around 9.3 exaflops to reach 1 kilometer resolution and maybe 11.5 exaflops to reach 800 meters, which is 4,096X the compute to increase the resolution by a factor of 16X.

Money is a real issue as far as we are concerned. The rest is just engineering around the money. Let’s take the inflation adjusted budgets for the machines at Oak Ridge for the past decade against their peak FP64 performance:

  • The 1.75 petaflops “Jaguar” system cost $82 million per petaflops
  • The 17.6 petaflops “Titan” system cost $6.5 million per petaflops
  • The 200 petaflops “Summit” machine cost a little more than $1 million per petaflops
  • The new 2 exaflops “Frontier” machine cost $400,000 per petaflops.

That is a factor of 16X improvement in the cost per petaflops between 2012 and 2022. Can the cost per petaflops by 2032 be driven down by another factor of 16X? Can it really go down to $25,000 per petaflops by 2032, which implies somewhere around $50,000 per petaflops by 2027, halfway between then and now? That would be $500 million for a 10 exaflops machine, based on the accelerated architectures outlined above, in 2027 and $250 million for one in 2032. That also implies $2.5 billion for a 100 exaflops machine in 2032. And that implies a GPU accelerated architecture. You can forget about all-CPU machines unless CPUs start looking a lot more like GPUs – which is an option, as the A64FX processor from Fujitsu shows. But still, an all-CPU machine like the “Fugaku” machine at RIKEN Lab in Japan cost $980 million to deliver 537.2 petaflops peak, and that is $1.82 million per petaflops. That is 4.6X more expensive per peak flops than Frontier. To be fair, Fugaku, like the K system before it at RIKEN, is an integrated design that performs well and is more computationally efficient than hybrid CPU-GPU designs. But Japan pays heavily for that.
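The budget extrapolation above is simple compounding. Here it is as a quick sketch, with the steady 16X-per-decade improvement taken as an assumption rather than a given:

```python
# Frontier's 2022 baseline: roughly $400,000 per peak FP64 petaflops.
BASE_COST_PER_PFLOPS = 400_000.0

def cost_per_pflops(years_out: float, decade_factor: float = 16.0) -> float:
    """Extrapolate $/petaflops assuming a steady 16x improvement per decade."""
    return BASE_COST_PER_PFLOPS / decade_factor ** (years_out / 10.0)

# By 2032 the trend line lands at $25,000 per petaflops, which prices a
# 10 exaflops (10,000 petaflops) machine at $250 million and a
# 100 exaflops machine at $2.5 billion.
c2032 = cost_per_pflops(10)
print(c2032)                  # 25000.0
print(c2032 * 10_000 / 1e6)   # 250.0 (millions of dollars)
print(c2032 * 100_000 / 1e9)  # 2.5 (billions of dollars)
```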

You can see why everyone wants to figure out how to use AI to stretch physics simulations and to use AI’s mixed precision engines to goose HPC performance, and you can see why the big machines are, for the most part, based on some sort of accelerator.

Using mixed precision, iterative solvers like the one used in the HPL-AI benchmark boost the effective performance of the machine by anywhere from 3X to 5X as well. That’s not using AI to boost HPC, and this is an important distinction that people sometimes miss. (It would have been helpful if this benchmark had been called High Performance Linpack – Mixed Precision, or HPL-MP, instead of HPL-AI, because the very name gives people the wrong impression of what is going on.)

Anyway. Back in November 2019, when Oak Ridge first used the iterative solver on the “Summit” supercomputer, it took all of Summit and got 148.6 petaflops using the standard HPL test in FP64 floating point mode on its vector engines. And with the iterative solver, which used a variety of mixed precision math to converge to the same answer, it was able to get an effective rate of 450 petaflops. (In other words, it got the answer 3X faster, and therefore it would have taken a run using only FP64 math 3X the time to do the same work.)

A lot of high-end machines have been tested using the HPL-AI benchmark and its iterative solvers, which have been refined since that time. On the June HPL-AI list, the refined solvers are able to get a 9.5X speedup on Summit, making it behave like a 1.41 exaflops machine. (This is what my dad used to call getting 10 pounds of manure in a 1 pound bag. He did not say “manure” even once in his life. . . .) On Frontier, which is a completely different architecture, a dominant slice of the machine is rated at 1.6 exaflops peak, 1.1 exaflops on HPL, and 6.86 exaflops on HPL-AI, and there is no reason to believe the effective flops cannot eventually be boosted by around 10X, as has happened on machines with very different architectures.

The question is, can this iterative, mixed precision solver approach be used in real-world HPC applications to the same effect? And the answer is simple: Yes, because it has to.

The next question is: Will we “cheat” what we call 20 exaflops by counting the effective performance of using iterative solvers at the heart of simulations? We don’t think you can, because different applications will be more or less amenable to this technique. If this could be done, applications running on Frontier would already be close to 10 exaflops of effective performance. We could go retire, like Nichols. (Who did more than his fair share of conjuring massive machines into existence. Have a great retirement, Jeff.)

Either way, we can’t just be thinking of the actual and effective FP64 rates on these machines. The mic drop in that paragraph from the RFI above is the need to train AI models with 100 trillion parameters.

Right now, the GPT-3 model with 3 billion parameters is not particularly useful, and more and more AI shops are training the GPT-3 model with 175 billion parameters. That is according to Rodrigo Liang, chief executive officer at SambaNova Systems, who we just talked to yesterday for another story we are working on. The hyperscalers are trying to crack 1 trillion parameters, and 100X more than that sounds as insane as it is probably necessary, given what HPC centers are trying to do over the next decade.

The “Aldebaran” Instinct MI250X GPU accelerator used in Frontier does not have support for FP8 floating point precision, so it cannot boost the parameter count and AI training throughput by dropping the resolution on the training data. Nvidia has FP8 support in the “Hopper” H100 GPU accelerator, and AMD will have it in the un-codenamed AMD MI300A GPU accelerator used in the El Capitan supercomputer. This helps. There may be a way to push training down to FP4, also boosting the effective throughput by 2X for certain kinds of training. But we think HPC centers in particular want to do training on high resolution data, not low resolution data. So anything shy of FP16 is probably not all that useful.

Here is what we are up against with this 100 trillion parameter AI model. Researchers at Stanford University, Microsoft, and Nvidia showed in a paper last year that they could train a natural language processing (NLP) model with 1 trillion parameters on a cluster with 3,072 Nvidia A100 GPUs running at 52 percent of peak theoretical performance of 965.4 petaflops at FP16 precision, which works out to 241.3 petaflops at FP64 precision. If things are linear, then to do 100 trillion parameters at FP16 precision would require a machine with 24.1 exaflops of oomph. And if for some reason the HPC centers want their data to stay at FP64 precision – and there is a good chance that they will in many cases – then we are talking about a machine with 96.5 exaflops. That would be 307,200 A100 GPUs and 187,200 H100 GPUs, and if the data stayed in FP64 format, you would need 1.23 million A100 GPUs and 748,800 H100 GPUs. We realize that making a parameter comparison between NLP models and HPC models (which will probably be more like visual processing than anything) is dubious, but we wanted to get a sense of the scale that we might be talking about here for 100 trillion parameters.

Even if you assume HPC centers could use sparsity feature support on the Tensor Cores instead of the FP16 on the vector cores on an Nvidia GPU – which is not always possible because some matrices in HPC are dense and that is that – that would still be 11,232 GPUs for FP16 formats and 44,928 GPUs for FP64 formats. But HPC centers have to plan for dense matrices, too. And if you think that Nvidia can double the compute density of its devices in the next decade – which is reasonable given that it has done that before – the number of streaming multiprocessors (SMs) is going to go up – we would say go through the roof – even if the machine might only need 5,000 or 6,000 GPU accelerators for sparse data. You will still need somewhere around 93,600 GPUs for dense matrix processing, and the level of concurrency across those SMs will be on the order of 100 million as Nvidia adopts chiplet designs and pushes packaging. (Which we think Nvidia will do because its researchers have been working on it.) If 93,600 of those future GPUs cost around $20,000 apiece in 2032, these alone would be $1.87 billion – most of the hypothetical $2.5 billion budget we talked about above.
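The GPU counts in the last two paragraphs follow from scaling the cited Megatron result linearly in parameter count. A quick sketch of that arithmetic (linear scaling is itself an assumption, since communication overheads grow with cluster size, and the 4X FP64 penalty follows the article's own accounting):

```python
# Baseline from the Stanford/Microsoft/Nvidia work cited above:
# a 1-trillion-parameter model trained on 3,072 A100 GPUs.
BASE_PARAMS = 1e12
BASE_GPUS = 3072

def gpus_needed(params: float, precision_penalty: float = 1.0,
                per_gpu_speedup: float = 1.0) -> int:
    """Scale GPU count linearly with parameters; precision_penalty covers
    keeping data in FP64 (4x here), per_gpu_speedup discounts the count
    for a hypothetical faster future accelerator."""
    return round(BASE_GPUS * (params / BASE_PARAMS)
                 * precision_penalty / per_gpu_speedup)

print(gpus_needed(100e12))                       # 307,200 A100s at FP16
print(gpus_needed(100e12, precision_penalty=4))  # ~1.23 million A100s at FP64
```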

AMD will be eager to keep winning these deals, of course, and with Nvidia not exactly being aggressive in HPC because it owns the AI market, AMD won’t have to worry about Nvidia too much. Intel is another story, and it will be perfectly happy to lose money on HPC deals to show its prowess. Aurora has not been a good story for Intel’s HPC aspirations, on many fronts.

We can’t wait to see how vendors respond – but this will not be data that is shared with the public. These RFIs and their RFP follow-ons are Fight Club.

Here is one other interesting bit in the RFI coming out of Oak Ridge: The DOE wants to consider modular approaches to HPC systems, which can be upgraded over time, rather than dropping in a machine every four or five years. (It used to be three to four years, but that is going to stretch because of the enormous cost of these future machines.)

“We also wish to explore the development of an approach that moves away from monolithic acquisitions toward a model for enabling more rapid upgrade cycles of deployed systems, to enable faster innovation on hardware and software,” the DOE RFI states. “One possible strategy would include increased reuse of existing infrastructure so that the upgrades are modular. A goal would be to reimagine systems architecture and an efficient acquisition process that allows continuous injection of technological advances to a facility (e.g., every 12–24 months rather than every 4–5 years). Understanding the tradeoffs of these approaches is one goal of this RFI, and we invite responses to include perceived benefits and/or disadvantages of this modular upgrade approach.”

We think this means disaggregated and composable infrastructure, so trays of CPUs, GPUs, memory, and storage can all be swapped in and out as faster and more energy efficient kickers become available. But upgradeability, while a nice dream, may not be particularly practical for capability class supercomputers.

First, swapping out components means spending all of that component money again, which is why we did not see the K supercomputer get upgraded at RIKEN Lab in Japan as Fujitsu rolled out generation after generation of Sparc64 processors. This is why we probably will not see new generations of A64FX processors, unless something happens in the HPC market in Japan that bucks history. Ditto for any machine based on Nvidia, AMD, or Intel GPUs. Touching a machine that is running is risky enough, but having to pay for it again and again is almost certainly not going to be feasible – unless the governments of the world, and the DOE in particular in this RFI case, have a plan to put in a fifth of a machine every five years, all with mixed components. But that causes its own problems, because you cannot get machines to work efficiently in lockstep when they finish bits of work at different rates.

This is going to be a lot harder than slapping together 100,000 GPU accelerators with 100 million SMs, with 1.6 Tb/sec Ethernet and InfiniBand interconnects, and with CXL 5.0 linking together CPU hosts and maybe a dozen GPUs into a node over 1 TB/sec PCI-Express 8.0 x16 links. But that’s probably where we would start and work our way backwards from there.

Update: We slipped a decimal point in the cost for machines in 2027 and 2032 due to a spreadsheet error. Our thanks to Jack Dongarra for catching it.

Thu, 30 Jun 2022 – Timothy Prickett Morgan