Download free M9510-726 test prep with a study guide for the IBM Rational DevOps Sales Mastery Test v1. Our certification exam guides are prepared by IT specialists: a group of M9510-726-certified professionals works together to build a large collection of real M9510-726 test questions for candidates to review before sitting the exam. Working through the M9510-726 practice questions we provide is enough to pass the M9510-726 test on the very first attempt.

Exam Code: M9510-726 Practice exam 2022 by the Killexams team
IBM Rational DevOps Sales Mastery Test v1
IBM Rational approach
Killexams : IBM is Modeling New AI After the Human Brain

Attentive Robots

Currently, artificial intelligence (AI) technologies are able to exhibit seemingly human traits. Some are intentionally humanoid, and others perform tasks that we normally associate strictly with humanity — songwriting, teaching, and visual art.

But as the field progresses, companies and developers are re-thinking the basis of artificial intelligence by examining our own intelligence and how we might effectively mimic it using machinery and software. IBM is one such company, as they have embarked on the ambitious quest to teach AI to act more like the human brain.


Many existing machine learning systems are built around the need to draw from sets of data. Whether they are problem-solving to win a game of Go or identifying skin cancer from images, this often remains true. This approach, however, is limited — and it differs from the way the human brain works.

We as humans learn incrementally. Simply put, we learn as we go. While we acquire knowledge to pull from as we go along, our brains adapt and absorb information differently from the way that many existing artificial systems are built. Additionally, we are logical. We use reasoning skills and logic to problem solve, something that these systems aren't yet terrific at accomplishing.

IBM is looking to change this. A research team at DeepMind has created a synthetic neural network that reportedly uses rational reasoning to complete tasks.

Rational Machinery

By giving the AI multiple objects and a specific task, "We are explicitly forcing the network to discover the relationships that exist," says Timothy Lillicrap, a computer scientist at DeepMind in an interview with Science Magazine. In a test of the network back in June, it was questioned about an image with multiple objects. The network was asked, for example: "There is an object in front of the blue thing; does it have the same shape as the tiny cyan thing that is to the right of the gray metal ball?"

In this test, the network correctly identified the object a staggering 96 percent of the time, compared to the measly 42 to 77 percent that more traditional machine learning models achieved. The advanced network was also adept at word problems and continues to be developed and improved upon. In addition to reasoning skills, researchers are advancing the network's ability to pay attention and even make and store memories.
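To make the flavor of such a relational query concrete, here is a toy, hand-coded version of the scene question above. The object records and attribute names are invented for illustration, and nothing here involves a neural network — it simply shows the kind of relationship the network must discover on its own:

```python
# Toy illustration of a relational query like the one posed to the network:
# objects are plain records, and the "reasoning" is explicit code.
# All scene data is invented for illustration.

scene = [
    {"id": "ball",  "shape": "sphere", "color": "gray", "material": "metal", "x": 0},
    {"id": "cyan",  "shape": "cube",   "color": "cyan", "size": "tiny",      "x": 2},
    {"id": "front", "shape": "cube",   "color": "red",                       "x": 4},
    {"id": "blue",  "shape": "cube",   "color": "blue",                      "x": 5},
]

def find(pred):
    return next(o for o in scene if pred(o))

# "the object in front of the blue thing" -> nearest object with smaller x
blue = find(lambda o: o.get("color") == "blue")
in_front = max((o for o in scene if o["x"] < blue["x"]), key=lambda o: o["x"])

# "the tiny cyan thing to the right of the gray metal ball"
ball = find(lambda o: o.get("color") == "gray" and o.get("material") == "metal")
cyan = find(lambda o: o.get("size") == "tiny" and o["x"] > ball["x"])

# "does it have the same shape?"
same_shape = in_front["shape"] == cyan["shape"]
```

Hand-coding every such relationship obviously does not scale; the point of DeepMind's network is that it learns to extract these relations from raw inputs rather than having them spelled out.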

Image Credit: ColiN00B / Pixabay

The future of AI development could be hastened and greatly expanded by using such tactics, according to Irina Rish, an IBM research staff member, in an interview with Engadget: "Neural network learning is typically engineered and it's a lot of work to actually come up with a specific architecture that works best. It's pretty much a trial and error approach ... It would be good if those networks could build themselves."

It might be scary to think of AI networks building and improving themselves, but if monitored, initiated, and controlled correctly, this could allow the field to expand beyond current limitations. Despite the brimming fears of a robot takeover, the advancement of AI technologies could save lives in the medical field, allow humans to get to Mars, and so much more. 

Wed, 29 Dec 2021 18:58:00 -0600
Killexams : IBM, NI Plug Systems Engineering Gap

With the number of lines of code in the average car expected to skyrocket from 10 million in 2010 to 100 million in 2030, there's no getting around the fact that embedded software development and a systems engineering approach has become central not only to automotive design, but to product design in general.

Yet despite the invigorated focus on what is essentially a long-standing design process, organizations still struggle with siloed systems and engineering processes that stand in the way of true systems engineering spanning mechanical, electrical, and software functions. In an attempt to address some of those hurdles, IBM and National Instruments are partnering to break down the silos specifically as they relate to the quality management engineering system workflow, or more colloquially, the marriage between design and test.

"As customers go through iterative development cycles, whether they're building a physical product or a software subsystem, and get to some level of prototype testing, they run into a brick wall around the manual handoff between the development and test side," Mark Lefebvre, director, systems alliances and integrations, for IBM Rational, told us. "Traditionally, these siloed processes never communicate and what happens is they find errors downstream in the software development process when it is more costly to fix."

NI and IBM's answer to this gap? The pair is building a bridge -- specifically an integration between IBM's Rational Quality Manager test- and quality-management tool and NI's VeriStand and TestStand real-time testing and test-automation environments. The integration, Lefebvre said, is designed to plug the gap and provide full traceability of what's defined on the test floor back to design and development, enabling more iterative testing throughout the lifecycle and uncovering errors earlier in the process, well before building costly prototypes.

The ability to break down the quality management silos and facilitate earlier collaboration can have a huge impact on cost if you look at the numbers IBM Rational is touting. According to Lefebvre, a bug that costs $1 to fix on a programmer's desktop costs $100 to fix once it makes its way into a complete program and many thousands of dollars once identified after the software has been deployed in the field.

While the integration isn't yet commercialized (Lefebvre said to expect it at the end of the third quarter), there is a proof of concept being tested with five or six big NI/IBM customers. The proof of concept is focused on the development of an embedded control unit (ECU) for a cruise control system that could operate across multiple vehicle platforms. The workflow exhibited marries the software development test processes to the hardware module test processes, from the requirements stage through quality management, so if a test fails or changes are made to the code, the results are shared throughout the development lifecycle.

Prior to such an integration, any kind of data sharing was limited to manual processes around Word documents and spreadsheets, Lefebvre said. "Typically, a software engineer would hand carry all the data in a spreadsheet and import it into the test environment. Now there's a pipe connecting the two."
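The "pipe" between test execution and quality management can be pictured with a small sketch. The `QualityManager` class and its record fields below are hypothetical stand-ins, not the actual Rational Quality Manager or TestStand APIs; the point is that each automated test run is recorded against a requirement, so failures trace back to design without a hand-carried spreadsheet:

```python
# Minimal sketch of the design-to-test "pipe" described above: automated
# test results are posted to a quality-management record instead of being
# hand-carried in spreadsheets. The API and field names are hypothetical.

class QualityManager:
    def __init__(self):
        self.results = []  # test outcomes, each linked to a requirement

    def record(self, requirement_id, test_name, passed):
        self.results.append(
            {"req": requirement_id, "test": test_name, "passed": passed}
        )

    def failing_requirements(self):
        # Traceability: which requirements have at least one failing test?
        return sorted({r["req"] for r in self.results if not r["passed"]})

qm = QualityManager()
# A test harness (standing in for the test-automation side) reports each run:
qm.record("REQ-CRUISE-01", "hold_speed_on_grade", passed=True)
qm.record("REQ-CRUISE-02", "resume_after_brake", passed=False)
```

With this in place, a failed test immediately identifies the requirement (and thus the design element) that needs rework, which is the traceability the integration aims to provide.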

Related posts:

Wed, 06 Jul 2022 12:00:00 -0500
Killexams : Book: Model Driven Architecture and Ontology Development

Model Driven Architecture and Ontology Development

Defining a formal domain ontology is generally considered a useful, not to say necessary, step in almost every software project. This is because software deals with ideas rather than with self-evident physical artefacts. However, this development step is hardly ever done, as ontologies rely on well-defined and semantically powerful AI concepts such as description logics or rule-based systems, and most software engineers are largely unfamiliar with these. Gasevic and his co-authors try to fill this gap by covering the subject of MDA application for ontology development on the Semantic Web.

Part I of their book describes existing technologies, tools, and standards like XML, RDF, OWL, MDA, and UML. Part II presents the first detailed description of OMG's new ODM (Ontology Definition Metamodel) initiative, a specification which is expected to be in the form of an OMG language like UML. Finally, Part III is dedicated to applications and practical aspects of developing ontologies using MDA-based languages. The book is supported by a website showing many ontologies, UML and other MDA-based models, and the transformations between them.
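As a taste of the kind of inference a formal ontology enables, the sketch below computes the transitive closure of a subclass relation in plain Python. The class names are invented, and a real system would use an OWL reasoner rather than this hand-rolled loop:

```python
# Naive illustration of one inference a formal ontology supports:
# transitive closure of subclass (subsumption) relations.
# Class names are invented; real systems would use an OWL reasoner.

subclass_of = {
    ("SportsCar", "Car"),
    ("Car", "Vehicle"),
    ("Vehicle", "Artifact"),
}

def superclasses(cls, pairs):
    """All classes that (transitively) subsume `cls`."""
    found = set()
    frontier = {cls}
    while frontier:
        # Direct superclasses of anything on the frontier, not yet seen.
        frontier = {sup for (sub, sup) in pairs if sub in frontier} - found
        found |= frontier
    return found
```

Description-logic reasoners perform far richer inferences than this (consistency checking, classification, property restrictions), but subsumption closure conveys the basic idea of deriving facts that were never stated explicitly.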

“The book is equally suited to those who merely want to be informed of the relevant technological landscape, to practitioners dealing with concrete problems, and to researchers seeking pointers to potentially fruitful areas of research. The writing is technical yet clear and accessible, illustrated throughout with useful and easily digestible examples.” from the Foreword by Bran Selic, IBM Rational Software, Canada.

“I do not know another book that offers such a high quality insight into UML and ontologies.” Steffen Staab, U Koblenz, Germany

Contents: Part I: Basics - Knowledge Representation - Ontologies - Semantic Web - Model Driven Architecture - Modeling Spaces.- Part II: Model Driven Architecture and Ontologies - Software Engineering Approaches for Ontology Development - MDA-Based Ontology Infrastructure - Ontology Definition Metamodel - Ontology UML Profile - Mappings of MDA Based Languages and Ontologies.- Part III: Application - Using UML Tools for Ontology Modeling - MDA Based Ontology Platform: AIR - Ontology Examples.

Sat, 04 Dec 2021 19:18:00 -0600
Killexams : A guide to continuous testing tools

Mobile Labs: Mobile Labs remains the leading supplier of in-house mobile device clouds that connect remote, shared devices to Global 2000 mobile web, gaming, and app engineering teams. Its patented GigaFox is offered on-premises or hosted, and solves mobile device sharing and management challenges during development, debugging, manual testing, and automated testing. A pre-installed and pre-configured Appium server provides “instant on” Appium test automation.

RELATED CONTENT: Testing all the time

NowSecure: NowSecure is the mobile app security software company trusted by the world’s most demanding organizations. Only the NowSecure Platform delivers fully automated mobile app security and privacy testing with the speed, accuracy, and efficiency necessary for Agile and DevSecOps environments. Through the industry’s most advanced static, dynamic, behavioral and interactive mobile app security testing on real Android and iOS devices, NowSecure identifies the broadest array of security threats, compliance gaps and privacy issues in custom-developed, commercial, and business-critical mobile apps. NowSecure customers can choose automated software on-premises or in the cloud, expert professional penetration testing and managed services, or a combination of all as needed. NowSecure offers the fastest path to deeper mobile app security and privacy testing and certification.

Parasoft: Parasoft’s software testing tool suite automates time-consuming testing tasks for developers and testers, and helps managers and team leaders pinpoint priorities. With solutions that are easy to use, adopt, and scale, Parasoft’s software testing tools fit right into your existing toolchain and shrink testing time with next-level efficiency, augmented with AI. Parasoft users are able to succeed in today’s most strategic development initiatives, to capture new growth opportunities and meet the growing expectations of consumer demands.

Perfecto: Perfecto offers a cloud-based continuous testing platform that takes mobile and web testing to the next level. It features a continuous quality lab with smart self-healing capabilities; test authoring, management, validation, and debugging of even advanced and hard-to-test business scenarios; test execution simulations; and smart analysis. For mobile testing, users can test against more than 3,000 real devices, and web developers can boost their test portfolio with cross-browser testing in the cloud.

CA Technologies offers next-generation, integrated continuous testing solutions that automate the most difficult testing activities — from requirements engineering through test design automation, service virtualization and intelligent orchestration. Built on end-to-end integrations and open source, CA’s comprehensive solutions help organizations eliminate testing bottlenecks impacting their DevOps and continuous delivery practices to test at the speed of agile, and build better apps, faster.

HPE Software’s automated testing solutions simplify software testing within fast-moving agile teams and for Continuous Integration scenarios. Integrated with DevOps tools and ALM solutions, HPE automated testing solutions keep quality at the center of today’s modern applications and hybrid infrastructures. 

IBM: Quality is essential and the combination of automated testing and service virtualization from IBM Rational Test Workbench allows teams to assess their software throughout their delivery lifecycle. IBM has a market leading solution for the continuous testing of end-to-end scenarios covering mobile, cloud, cognitive, mainframe and more. 

Micro Focus is a leading global enterprise software company with a world-class testing portfolio that helps customers accelerate their application delivery and ensure quality and security at every stage of the application lifecycle — from the first backlog item to the user experience in production. Simplifying functional, mobile, performance and application security within fast-moving Agile teams and for DevOps, Micro Focus testing solutions keep quality at the center of today’s modern applications and hybrid infrastructures with an integrated end-to-end application lifecycle management solution that is built for any methodology, technology and delivery model. 

Microsoft provides a specialized tool set for testers that delivers an integrated experience starting from agile planning to test and release management, on premises or in the cloud. 

Orasi is a leading provider of software testing services, utilizing test management, test automation, enterprise testing, Continuous Delivery, monitoring, and mobile testing technology. 

Progress: Telerik Test Studio is a test automation solution that helps teams be more efficient in functional, performance and load testing, improving test coverage and reducing the number of bugs that slip into production. 

QASymphony’s qTest is a Test Case Management solution that integrates with popular development tools. QASymphony offers qTest eXplorer for teams doing exploratory testing. 

Rogue Wave is the largest independent provider of cross-platform software development tools and embedded components in the world. Rogue Wave Software’s Klocwork boosts software security and creates more reliable software. With Klocwork, analyze static code on-the-fly, simplify peer code reviews, and extend the life of complex software. Thousands of customers, including the biggest brands in the automotive, mobile device, consumer electronics, medical technologies, telecom, military and aerospace sectors, make Klocwork part of their software development process. 

Sauce Labs provides the world’s largest cloud-based platform for automated testing of web and mobile applications. Optimized for use in CI and CD environments, and built with an emphasis on security, reliability and scalability, users can run tests written in any language or framework using Selenium or Appium, both widely adopted open-source standards for automating browser and mobile application functionality.

SmartBear provides a range of frictionless tools to help testers and developers deliver robust test automation strategies. With powerful test planning, test creation, test data management, test execution, and test environment solutions, SmartBear is paving the way for teams to deliver automated quality at both the UI and API layer. SmartBear automation tools ensure functional, performance, and security correctness within your deployment process, integrating with tools like Jenkins, TeamCity, and more. 

SOASTA’s Digital Performance Management (DPM) Platform enables measurement, testing and improvement of digital performance. It includes five technologies: mPulse real user monitoring (RUM); the CloudTest platform for continuous load testing; TouchTest mobile functional test automation; Digital Operation Center (DOC) for a unified view of contextual intelligence accessible from any device; and Data Science Workbench, simplifying analysis of current and historical web and mobile user performance data. 

Synopsys: Through its Software Integrity platform, Synopsys provides a comprehensive suite of testing solutions for rapidly finding and fixing critical security vulnerabilities, quality defects, and compliance issues throughout the SDLC. 

TechExcel: DevTest is a sophisticated quality-management solution used by development and QA teams of all sizes to manage every aspect of their testing processes. 

Testplant: Eggplant’s Digital Automation Intelligence Suite empowers teams to continuously create amazing, user-centric digital experiences by testing the true UX, not the code. 

Tricentis is recognized by both Forrester and Gartner as a leader in software test automation, functional testing, and continuous testing. Our integrated software testing solution, Tricentis Tosca, provides a unique Model-based Test Automation and Test Case Design approach to functional test automation—encompassing risk-based testing, test data management and provisioning, service virtualization, API testing and more.

Thu, 30 Jun 2022 11:59:00 -0500
Killexams : Comprehensive Change Management for SoC Design

By Sunita Chulani1, Stanley M. Sutton Jr.1, Gary Bachelor2, and P. Santhanam1
1 IBM T. J. Watson Research Center, 19 Skyline Drive, Hawthorne, NY 10532 USA
2 IBM Global Business Services, PO Box 31, Birmingham Road, Warwick CV34 5JL UK


Systems-on-a-Chip (SoC) are becoming increasingly complex, leading to corresponding increases in the complexity and cost of SoC design and development.  We propose to address this problem by introducing comprehensive change management.  Change management, which is widely used in the software industry, involves controlling when and where changes can be introduced into components so that changes can be propagated quickly, completely, and correctly.
In this paper we address two main topics. One is typical scenarios in electronic design where change management can be supported and leveraged. The other is the specification of a comprehensive schema to illustrate the varieties of data and relationships that are important for change management in SoC design.


SoC designs are becoming increasingly complex.  Pressures on design teams and project managers are rising because of shorter times to market, more complex technology, more complex organizations, and geographically dispersed multi-partner teams with varied “business models” and higher “cost of failure.”

Current methodology and tools for designing SoC need to evolve with market demands in key areas:  First, multiple streams of inconsistent hardware (HW) and software (SW) processes are often integrated only in the late stages of a project, leading to unrecognized divergence of requirements, platforms, and IP, resulting in unacceptable risk in cost, schedule, and quality.  Second, even within a stream of HW or SW, there is inadequate data integration, configuration management, and change control across life cycle artifacts.  Techniques used for these are often ad hoc or manual, and the cost of failure is high.  This makes it difficult for a distributed team to be productive and inhibits the early, controlled reuse of design products and IP.  Finally, the costs of deploying and managing separate dedicated systems and infrastructures are becoming prohibitive.

We propose to address these shortcomings through comprehensive change management, which is the integrated application of configuration management, version control, and change control across software and hardware design.  Change management is widely practiced in the software development industry.  There are commercial change-management systems available for use in electronic design, such as MatrixOne DesignSync [4], ClioSoft SOS [2], IC Manage Design Management [3], and Rational ClearCase/ClearQuest [1], as well as numerous proprietary, “home-grown” systems.  But to date change management remains an under-utilized technology in electronic design.

In SoC design, change management can help with many problems.  For instance, when IP is modified, change management can help in identifying blocks in which the IP is used, in evaluating other affected design elements, and in determining which tests must be rerun and which rules must be re-verified. Or, when a new release is proposed, change management can help in assessing whether the elements of the release are mutually consistent and in specifying IP or other resources on which the new release depends.

More generally, change management gives the ability to analyze the potential impact of changes by tracing to affected entities and the ability to propagate changes completely, correctly, and efficiently.  For design managers, this supports decision-making as to whether, when, and how to make or accept changes.  For design engineers, it helps in assessing when a set of design entities is complete and consistent and in deciding when it is safe to make (or adopt) a new release.

In this paper we focus on two elements of this approach for SoC design.  One is the specification of representative use cases in which change management plays a critical role.  These show places in the SoC development process where information important for managing change can be gathered.  They also show places where appropriate information can be used to manage the impact of change.  The second element is the specification of a generic schema for modeling design entities and their interrelationships.  This supports traceability among design elements, allows designers to analyze the impact of changes, and facilitates the efficient and comprehensive propagation of changes to affected elements.

The following section provides some background on a survey of subject-matter experts that we performed to refine the problem definition.     


We surveyed some 30 IBM subject-matter experts (SMEs) in electronic design, change management, and design data modeling.  They identified 26 problem areas for change management in electronic design.  We categorized these as follows:

  • visibility into project status
  • day-to-day control of project activities
  • organizational or structural changes
  • design method consistency
  • design data consistency

Major themes that crosscut these included:

  • visibility and status of data
  • comprehensive change management
  • method definition, tracking, and enforcement
  • design physical quality
  • common approach to problem identification and handling

We held a workshop with the SMEs to prioritize these problems, and two emerged as the most significant: first, the need for basic management of the configuration of all the design data and resources of concern within a project or work package (libraries, designs, code, tools, test suites, etc.); second, the need for designer visibility into the status of data and configurations in a work package.

To realize these goals, two basic kinds of information are necessary:  1) An understanding of how change management may occur in SoC design processes; 2) An understanding of the kinds of information and relationships needed to manage change in SoC design.  We addressed the former by specifying change-management use cases; we addressed the latter by specifying a change-management schema.


This section describes typical use cases in the SoC design process.  Change is a pervasive concern in these use cases—they cause changes, respond to changes, or depend on data and other resources that are subject to change.  Thus, change management is integral to the effective execution of each of these use cases. We identified nine representative use cases in the SoC design process, which are shown in Figure 1.

Figure 1.  Use cases in SoC design

In general there are four ways of initiating a project: New Project, Derive, Merge and Retarget.  New Project is the case in which a new project is created from the beginning.  The Derive case is initiated when a new business opportunity arises to base a new project on an existing design. The Merge case is initiated when an actor wants to merge configuration items during implementation of a new change management scheme or while working with teams/organizations outside of the current scheme. The Retarget case is initiated when a project is restructured due to resource or other constraints.  In all of these use cases it is important to institute proper change controls from the outset.  New Project starts with a clean slate; the other scenarios require changes from (or to) existing projects.    

Once the project is initiated, the next phase is to update the design. There are two use cases in the Update Design composite state.  New Design Elements addresses the original creation of new design elements.  These become new entries in the change-management system.  The Implement Change use case entails the modification of an existing design element (such as fixing a bug).  It is triggered in response to a change request and is supported and governed by change-management data and protocols.

The next phase is Resolve Project, which consists of three use cases. Backout is the use case by which changes made in the previous phase can be reversed.  Release is the use case by which a project is released for cross-functional use. The Archive use case protects design assets by making a secure copy of the design and environment.


The main goal of the change-management schema is to enable the capture of all information that might contribute to change management.

4.1     Overview

The schema, which is defined in the Unified Modeling Language (UML) [5], consists of several high-level packages (Figure 2).


Figure 2.  Packages in the change-management schema

Package Data represents types for design data and metadata.  Package Objects and Data defines types for objects and data.  Objects are containers for information; data represent the information.  The main types of object include artifacts (such as files), features, and attributes.  The types of objects and data defined are important for change management because they represent the principal work products of electronic design: IP, VHDL and RTL specifications, floor plans, formal verification rules, timing rules, and so on.  It is changes to these things for which management is most needed.

The package Types defines types to represent the types of objects and data.  This enables some types in the schema (such as those for attributes, collections, and relationships) to be defined parametrically in terms of other types, which promotes generality, consistency, and reusability of schema elements.

Package Attributes defines specific types of attribute.  The basic attribute is just a name-value pair that is associated to an object.  (More strongly-typed subtypes of attribute have fixed names, value types, attributed-object types, or combinations of these.)  Attributes are one of the main types of design data, and they are important for change management because they can represent the status or state of design elements (such as version number, verification level, or timing characteristics).

Package Collections defines types of collections, including collections with varying degrees of structure, typing, and constraints.  Collections are important for change management in that changes must often be coordinated for collections of design elements as a group (e.g., for a work package, verification suite, or IP release).  Collections are also used in defining other elements in the schema (for example, baselines and change sets).

The package Relationships defines types of relationships.  The basic relationship type is an ordered collection of a fixed number of elements.  Subtypes provide directionality, element typing, and additional semantics.  Relationships are important for change management because they can define various types of dependencies among design data and resources.  Examples include the use of macros in cores, the dependence of timing reports on floor plans and timing contracts, and the dependence of test results on tested designs, test cases, and test tools.  Explicit dependency relationships support the analysis of change impact and the efficient and precise propagation of changes.
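Explicit dependency relationships of this kind lend themselves directly to impact analysis. The sketch below (with invented artifact names, not entries from the actual schema) finds every artifact transitively affected by a change:

```python
# Sketch of impact analysis over explicit dependency relationships.
# depends_on maps an artifact to the artifacts it is derived from;
# the artifact names are invented for illustration.

depends_on = {
    "timing_report": ["floor_plan", "timing_contract"],
    "floor_plan": ["core_netlist"],
    "core_netlist": ["macro_lib"],
}

def impacted_by(changed, deps):
    """All artifacts that transitively depend on `changed`."""
    impacted = set()
    while True:
        newly = {
            art for art, srcs in deps.items()
            if any(s == changed or s in impacted for s in srcs)
        } - impacted
        if not newly:
            return impacted
        impacted |= newly
```

Changing `macro_lib`, for example, flags the netlist, the floor plan, and the timing report as needing regeneration, which is exactly the precise propagation the schema is meant to enable.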

The package Specifications defines types of data specification and definition.  Specifications specify an informational entity; definitions denote a meaning and are used in specifications.

Package Resources represents things (other than design data) that are used in design processes, for example, design tools, IP, design methods, and design engineers.  Resources are important for change management in that resources are used in the actions that cause changes and in the actions that respond to changes.  Indeed, minimizing the resources needed to handle changes is one of the goals of change management.

Resources are also important in that changes to a resource may require changes to design elements that were created using that resource (for example, when changes to a simulator may require reproduction of simulation results).

Package Events defines types and instances of events.  Events are important in change management because changes are a kind of event, and signals of change events can trigger processes to handle the change.

The package Actions provides a representation for things that are done, that is, for the behaviors or executions of tools, scripts, tasks, method steps, etc.  Actions are important for change in that actions cause change.  Actions can also be triggered in response to changes and can handle changes (such as by propagating changes to dependent artifacts).

Subpackage Action Definitions defines the type Action Execution, which contains information about a particular execution of a particular action.  It refers to the definition of the action and to the specific artifacts and attributes read and written, resources used, and events generated and handled.  Thus an action execution indicates particular artifacts and attributes that are changed, and it links those to the particular process or activity by which they were changed, the particular artifacts and attributes on which the changes were based, and the particular resources by which the changes were effected.  Through this, particular dependency relationships can be established between the objects, data, and resources.  This is the specific information needed to analyze and propagate concrete changes to artifacts, processes, and resources.

Package Baselines defines types for defining mutually consistent sets of design artifacts. Baselines are important for change management in several respects.  The elements in a baseline must be protected from arbitrary changes that might disrupt their mutual consistency, and the elements in a baseline must be changed in mutually consistent ways in order to evolve a baseline from one version to another.

The final package in Figure 2 is the Change package.  It defines types for representing change explicitly.  These include managed objects, which are objects with an associated change log; change logs and change sets, which are types of collection that contain change records; and change records, which record specific changes to specific objects.  A change record can include a reference to the action execution that caused the change.

The subpackage Change Requests includes types for modeling change requests and responses.  A change request has a type, description, state, priority, and owner.  It can have an associated action definition, which may be the definition of the action to be taken in processing the change request.  A change request also has a change-request history log.
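The core Change types described above can be sketched in code. The following Python classes are a minimal, illustrative reading of the schema; the names and fields are ours, not the paper's:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ActionExecution:
    """A particular execution of a defined action: what it read and wrote."""
    action_definition: str
    artifacts_read: List[str] = field(default_factory=list)
    artifacts_written: List[str] = field(default_factory=list)

@dataclass
class ChangeRecord:
    """A specific change to a specific object, optionally linked to the
    action execution that caused it."""
    object_id: str
    description: str
    cause: Optional[ActionExecution] = None

@dataclass
class ManagedObject:
    """An object with an associated change log."""
    object_id: str
    change_log: List[ChangeRecord] = field(default_factory=list)

    def record_change(self, description, cause=None):
        record = ChangeRecord(self.object_id, description, cause)
        self.change_log.append(record)
        return record
```

Because each record can point back to the action execution that caused it, the change log supports exactly the analysis described above: tracing a concrete change to the artifacts it was based on and the process that made it.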

4.2    Example

An example of the schema is shown in Figure 3.  The clear boxes (upper part of diagram) show general types from the schema and the shaded boxes (lower part of the diagram) show types (and a few instances) defined for a specific high-level design process project at IBM.


Figure 3.  Example of change-management data

The figure shows a dependency relationship between two types of design artifact, VHDLArtifact and FloorPlannableObjects.  The relationship is defined in terms of a compiler that derives instances of FloorPlannableObjects from instances of VHDLArtifact.  Execution of the compiler constitutes an action that defines the relationship.  The specific schema elements are defined based on the general schema using a variety of object-oriented modeling techniques, including subtyping (e.g., VHDLArtifact), instantiation (e.g., Compile1) and parameterization (e.g., VHDLFloorplannableObjectsDependency).
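As a hypothetical sketch of this example, the compile step can be modeled as a function whose execution both produces the derived artifact and records the dependency; the class and function names below are invented for illustration:

```python
# Hypothetical sketch of the Figure 3 example: names are invented here,
# not taken from the paper's schema.
class DesignArtifact:
    def __init__(self, name):
        self.name = name
        self.derived_from = []  # dependency links established by actions

class VHDLArtifact(DesignArtifact): ...
class FloorPlannableObjects(DesignArtifact): ...

def compile_vhdl(source: VHDLArtifact) -> FloorPlannableObjects:
    """The compile action: its execution produces the derived artifact
    and records the dependency relationship between the two."""
    derived = FloorPlannableObjects(source.name + ".fp")
    derived.derived_from.append(source)
    return derived
```

The point of the sketch is that the dependency is not declared statically; it is established by the action execution itself, which is what lets changes to a VHDLArtifact be propagated to its derived FloorPlannableObjects.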


Here we present an example use case, Implement Change, with details on its activities and how the activities use the schema presented in Section 4.  This use case is illustrated in Figure 4.


Figure 4.  State diagram for use case Implement Change

The Implement Change use case addresses the modification of an existing design element (such as fixing a bug).  It is triggered by a change request.  The first steps of this use case are to identify and evaluate the change request to be handled.  Then the relevant baseline is located, loaded into the engineer’s workspace, and verified.  At this point the change can be implemented.  This begins with the identification of the artifacts that are immediately affected.  Then dependent artifacts are identified and changes propagated according to dependency relationships.  (This may entail several iterations.)  Once a stable state is achieved, the modified artifacts are tested and regression tested.  Depending on test results, more changes may be required.  Once the change is considered acceptable, any learning and metrics from the process are captured and the new artifacts and relationships are promoted to the public configuration space.
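The activity flow above can be paraphrased as a linear state sequence with a loop back to modification when tests fail. The state names below paraphrase Figure 4 rather than quote it:

```python
# Illustrative paraphrase of the Implement Change use case; state names
# are ours, not the paper's.
STATES = [
    "identify_request", "evaluate_request", "locate_baseline",
    "load_workspace", "verify_baseline", "modify_artifacts",
    "propagate_changes", "test", "capture_metrics", "promote",
]

def implement_change(tests_pass):
    """Walk the use case; `tests_pass` yields one bool per test attempt.
    A failed test loops back to artifact modification."""
    history = []
    i = 0
    while i < len(STATES):
        state = STATES[i]
        history.append(state)
        if state == "test" and not next(tests_pass):
            i = STATES.index("modify_artifacts")  # more changes required
            continue
        i += 1
    return history
```

Running this with one failed test attempt produces two passes through the modify/propagate/test loop before the change is captured and promoted, mirroring the "several iterations" noted above.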


This paper explores the role of comprehensive change management in SoC design, development, and delivery.  Based on the comments of over thirty experienced electronic design engineers from across IBM, we have captured the essential problems and motivations for change management in SoC projects. We have described design scenarios, highlighting places where change management applies, and presented a preliminary schema to show the range of data and relationships change management may incorporate.  Change management can benefit both design managers and engineers.  It is increasingly essential for improving productivity and reducing time and cost in SoC projects.


Contributions to this work were also made by Nadav Golbandi and Yoav Rubin of IBM’s Haifa Research Lab.  Much information and guidance were provided by Jeff Staten and Bernd-josef Huettl of IBM’s Systems and Technology Group. We especially thank Richard Bell, John Coiner, Mark Firstenberg, Andrew Mirsky, Gary Nusbaum, and Harry Reindel of IBM’s Systems and Technology Group for sharing design data and experiences.  We are also grateful to the many other people across IBM who contributed their time and expertise.







7 Best Income Stocks to Buy Now

While rational traders participate in the equities market to see solid returns on their investments, the present paradigm encourages everyone to consider the best income stocks to buy now. Of course, no one is going to complain about robust capital gains. That is, until tax season. But the inflationary crisis we’re in puts more emphasis on passive income than ever before.

As of this writing, the annual inflation rate for the U.S. is 8.6% for the 12 months ended May 2022, the largest annual increase since December 1981. Naturally, consumers mostly feel the heat when they pump gasoline into their cars or buy groceries for their family. The best income stocks to buy may help mitigate this sticker shock.

Another factor to consider during this period is inflation’s impact on real earnings. Because prices of goods and core utilities are rising, you’re basically receiving a pay cut or hidden tax. Obviously, such a circumstance can be incredibly frustrating, though it cynically adds to the bullish case for the best income stocks to buy now.
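As a rough illustration of that hidden pay cut (the figures here are ours, not the article's), the standard real-return arithmetic looks like this:

```python
# Back-of-envelope only: at 8.6% annual inflation, a nominal return must
# clear that hurdle just to preserve purchasing power.
def real_return(nominal: float, inflation: float) -> float:
    """Fisher relation: (1 + nominal) / (1 + inflation) - 1."""
    return (1 + nominal) / (1 + inflation) - 1

# A hypothetical 5% yield during 8.6% inflation still loses ground.
print(round(real_return(0.05, 0.086), 4))  # -0.0331
```

In other words, even a healthy 5% payout leaves a holder roughly 3.3% behind in purchasing power at that inflation rate, which is why yield alone does not tell the whole story.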

Ticker   Company          Price
CVX      Chevron          $144.61
ABBV     AbbVie           $149.74
IBM      IBM              $130.88
ENB      Enbridge         $43.15
ADC      Agree Realty     $76.26
LTC      LTC Properties   $39.74
SCCO     Southern Copper  $49.08

Income Stocks to Buy: Chevron (CVX)

Though hydrocarbon-related companies have been unpopular for a very long time, the dirty little secret is that they’re relevant and necessary. As I’ve mentioned several times before, fossil fuels are difficult to quit because of their high energy density. Essentially, for just a gallon of gas, you can move an SUV down the freeway for 20 or 30 miles.

You’re just not going to find that kind of density from electric vehicles, which is one of the most powerful (albeit cynical) arguments bolstering Chevron (NYSE:CVX). One of the big oil giants, Chevron will likely never court the public’s sympathies. Setting that issue aside, though, the company is extraordinarily relevant, especially because Russia’s reckless war in Ukraine has effectively shelved much of the world’s energy supplies.

As The Economist pointed out recently, international policymakers have warned that the Ukraine crisis could last for years. Such a scenario presents myriad questions about societal and economic stability. However, it all but guarantees that the crisis keeps the lights on at Chevron and then some, making it one of the most effective among the best income stocks to buy.

AbbVie (ABBV)

While the vast majority of the global public is ready to put the coronavirus nightmare behind it, recent reporting has surfaced a concerning new phenomenon. While Covid-19 vaccines are due for an upgrade, “emerging variants and fickle immune reactions mean it’s not clear what new jabs should look like.”

Nevertheless, after two years of lockdowns and various mitigation mandates, the fear of Covid-19 has been fading. Instead, concepts like retail revenge or revenge travel have taken over public sentiment, which suits AbbVie (NYSE:ABBV) just fine. As a pharmaceutical giant that now owns the Botox neurotoxin, AbbVie is an underappreciated investment for the return to normal.

Back during the worst of the pandemic, the nation experienced what a Washington Post op-ed referred to as our pajama moment. Those days are now gone, with powerful voices in business demanding that their workers return to the office. In other words, the emphasis is now back to looking good, which may help lift Botox sales.

In turn, ABBV stock is one of the best income stocks to buy now — featuring a forward yield of 3.7%.

Income Stocks to Buy: IBM (IBM)

Amid the tight competition in the technology sphere, IBM (NYSE:IBM) oftentimes gets overlooked. It’s not necessarily fair considering that the company has been making significant inroads with cloud computing, cybersecurity, artificial intelligence and other groundbreaking innovations. Still, it’s tough to shed a less-than-favorable reputation.

However, IBM is so far getting the last laugh. On a year-to-date (or YTD) basis, shares are down 2%, which isn’t exactly riveting stuff. But when stacked up against popular tech plays — many of which are hemorrhaging sizable double-digit figures — IBM might as well be shooting to the moon. Indeed, since December of last year, Big Blue has been quietly making a comeback.

Long-term investors may want to consider IBM simply on the basis that it has its hands in several relevant technologies. Adding in its passive income potential is a sweet bonus, particularly with its forward yield of 5.1%. Sometimes, slow and steady wins the race for the best income stocks to buy.

Enbridge (ENB)

While myriad oil and natural gas companies may qualify for the best income stocks to buy now, one of the challenges for companies tied to the upstream business model — or the exploration and initial production of fossil fuels — is that energy pricing can be volatile. For a little bit more stability, you may want to consider midstream operators like Enbridge (NYSE:ENB).

Midstream firms specialize in activities such as processing, storage, transportation and marketing of hydrocarbon products. The beauty about Enbridge is that the company owns and operates the largest network of oil and gas pipelines in North America, making it an ingrained component of the transportation sector and more broadly, national security.

What really makes ENB stock stand out as one of the best income stocks to buy is its generous payout. Featuring a forward yield of 6.2%, Enbridge can help cushion some of the shock associated with inflation. In addition, data on vehicle miles traveled suggests that the company has significant upside ahead.

Income Stocks to Buy: Agree Realty (ADC)

Among the largest real estate investment trusts (REITs), Agree Realty (NYSE:ADC) is particularly attractive for its relevance. While the consumer economy is undoubtedly hurting from the soaring inflation rate, Agree Realty invests in properties net leased to some of the biggest names in commerce such as Walmart (NYSE:WMT) and Home Depot (NYSE:HD).

Put another way, while many analysts expect a recession of some sort, few are calling for a devastating depression that would result in unprecedented cuts to spending. The likely scenario is that consumers will focus more on essential goods, which should benefit Agree Realty.

Another factor that bolsters the case of ADC stock being one of the best income stocks to buy now is that it distributes passive income on a monthly basis. As you know, the frequency of life — mortgage/rent payments, internet service contracts, utility bills — is monthly. Therefore, Agree Realty helps you get the funds you need when you need them the most.

LTC Properties (LTC)

If you’re seeking a diversified portfolio of the best income stocks to buy, LTC Properties (NYSE:LTC) is well worth consideration. For one thing, this REIT also offers monthly payouts, enabling you to align your passive income with the bills that you pay. Furthermore, this payout frequency enables faster compounding, providing a critical tool to combat inflation.
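The compounding advantage of monthly payouts is easy to put in numbers; the sketch below assumes a hypothetical 6% annual yield, fully reinvested:

```python
# Illustrative arithmetic (not the article's): a 6% annual yield paid and
# reinvested monthly compounds to slightly more than 6% over a year.
def effective_annual_yield(nominal_rate: float, payouts_per_year: int) -> float:
    """Effective annual yield with intra-year reinvestment."""
    return (1 + nominal_rate / payouts_per_year) ** payouts_per_year - 1

annual_payout  = effective_annual_yield(0.06, 1)    # 6.00%
monthly_payout = effective_annual_yield(0.06, 12)   # ~6.17%
```

The twelve small reinvestments add roughly 17 basis points over a single annual payout. Modest on its own, but it compounds year after year, which is the "faster compounding" advantage monthly payers offer.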

Beyond this administrative point, though, LTC Properties is attractive for its core business. The REIT specializes in senior housing and healthcare, primarily through sale-leasebacks, mortgage financing, joint-ventures, construction financing and structured finance solutions. LTC’s portfolio is roughly divided in half between senior housing and skilled nursing properties.

As myriad publications have mentioned, baby boomers are retiring in large numbers, with this pace of retirement accelerating. The unique factors of the Covid-19 crisis have also led to workers older than 55 representing the majority of participants of the Great Resignation.

Basically, over the next several years, demand for senior care should rise exponentially. Therefore, LTC stock appears a solid long-term bet among the best income stocks to buy.

Income Stocks to Buy: Southern Copper (SCCO)

With the rise of meme stocks and cryptocurrencies, it’s apparent that quite a few people have the speculation bug in their bones. Well, I’m the type of person that likes to give the audience what it wants. So, if you want to dial up the risk-reward factor for the best income stocks to buy now, you may want to have a look at Southern Copper (NYSE:SCCO).

To be clear, copper prices are slipping badly, largely on global recession fears. With inflation reducing real earnings, consumers are naturally going to reduce their expenditures, first avoiding the discretionary purchases and later much of the lower-priority essentials. Eventually, such cuts are going to impact copper demand, which isn’t great for SCCO stock.

At the same time, copper is critical for the industries and technologies of tomorrow, most notably electric vehicles (or EVs). Moreover, with the electrification of transportation being a vital component of the broader strategy to reduce foreign oil dependencies, SCCO stock might be worth consideration.

Oh yeah, the company features a forward yield of 10%.

On the date of publication, Josh Enomoto did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the Publishing Guidelines.

A former senior business analyst for Sony Electronics, Josh Enomoto has helped broker major contracts with Fortune Global 500 companies. Over the past several years, he has delivered unique, critical insights for the investment markets, as well as various other industries including legal, construction management, and healthcare.

Biden wants an industrial renaissance. He can’t do it without immigration reform.

JOHNSTOWN, Ohio — Just 15 minutes outside of downtown Columbus, the suburbs abruptly evaporate. Past a bizarre mix of soybean fields, sprawling office parks and lonely clapboard churches is a field where the Biden administration — with help from one of the world’s largest tech companies — hopes to turn the U.S. into a hub of microchip manufacturing.

In his State of the Union address in March, President Joe Biden called this 1,000-acre spread of corn stalks and farmhouses a “field of dreams.” Within three years, it will house two Intel-operated chip facilities together worth $20 billion — and Intel is promising to invest $80 billion more now that Washington has sweetened the deal with subsidies. It’s all part of a nationwide effort to head off another microchip shortage, shore up the free world’s advanced industrial base in the face of a rising China and claw back thousands of high-end manufacturing jobs from Asia.

But even as Biden signs into law more than $52 billion in “incentives” designed to lure chipmakers to the U.S., an unusual alliance of industry lobbyists, hard-core China hawks and science advocates says the president’s dream lacks a key ingredient — a small yet critical core of high-skilled workers. It’s a politically troubling irony: To achieve the long-sought goal of returning high-end manufacturing to the United States, the country must, paradoxically, attract more foreign workers.

“For high-tech industry in general — which of course, includes the chip industry — the workforce is a huge problem,” said Julia Phillips, a member of the National Science Board. “It’s almost a perfect storm.”

From electrical engineering to computer science, the U.S. currently does not produce enough doctoral and master’s graduates in the science, technology, engineering and math fields who can go on to work in U.S.-based microchip plants. Decades of declining investment in STEM education mean the U.S. now produces fewer native-born recipients of advanced STEM degrees than most of its international rivals.

Foreign nationals, including many educated in the U.S., have traditionally filled that gap. But a bewildering and anachronistic immigration system, historic backlogs in visa processing and rising anti-immigrant sentiment have combined to choke off the flow of foreign STEM talent precisely when a fresh surge is needed.

Powerful members of both parties have diagnosed the problem and floated potential fixes. But they have so far been stymied by the politics of immigration, where a handful of lawmakers stand in the way of reforms few are willing to risk their careers to achieve. With a short window to attract global chip companies already starting to close, a growing chorus is warning Congress they’re running out of time.

“These semiconductor investments won’t pay off if Congress doesn’t fix the talent bottleneck,” said Jeremy Neufeld, a senior immigration fellow at the Institute for Progress think tank.

Given the hot-button nature of immigration fights, the chip industry has typically been hesitant to advocate directly for reform. But as they pump billions of dollars into U.S. projects and contemplate far more expensive plans, a sense of urgency is starting to outweigh that reluctance.

“We are seeing greater and greater numbers of our employees waiting longer and longer for green cards,” said David Shahoulian, Intel’s head of workforce policy. “At some point it will become even more difficult to attract and retain folks. That will be a problem for us; it will be a problem for the rest of the tech industry.”

“At some point, you’ll just see more offshoring of these types of positions,” Shahoulian said.

A Booming Technology

Microchips (often called “semiconductors” by wonkier types) aren’t anything new. Since the 1960s, scientists — working first for the U.S. government and later for private industry — have tacked transistors onto wafers of silicon or other semiconducting materials to produce computer circuits. What has changed is the power and ubiquity of these chips.

The number of transistors researchers can fit on a chip roughly doubles every two years, a phenomenon known as Moore’s Law. In recent years, that has led to absurdly powerful chips bristling with transistors — IBM’s latest chip packs them at two-nanometer intervals into a space roughly the size of a fingernail. Two nanometers is thinner than a strand of human DNA, or about how long a fingernail grows in two seconds.
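Because Moore's Law is just a doubling every fixed period, projecting it forward is a one-line exponential; a back-of-envelope sketch (the figures are illustrative):

```python
# Moore's Law as stated above: density doubles roughly every two years,
# so a projection is plain exponential growth.
def moore_projection(transistors_now: float, years: float,
                     doubling_period_years: float = 2.0) -> float:
    return transistors_now * 2 ** (years / doubling_period_years)

# Four years at the historical pace quadruples the transistor count.
```

That exponential is what turned the room-sized circuits of the 1960s into today's fingernail-sized chips, and it drives the economics of the factories discussed below.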

A rapid boost in processing power stuffed into ever-smaller packages led to the information technology boom of the 1990s. And things have only accelerated since — microchips remain the primary driver of advances in smartphones and missiles, but they’re also increasingly integrated into household appliances like toaster ovens, thermostats and toilets. Even the most inexpensive cars on the market now contain hundreds of microchips, and electric or luxury vehicles are loaded with thousands.

It all adds up to a commodity widely viewed as the bedrock of the new digital economy. Like fossil fuels before them, any country that controls the production of chips possesses key advantages on the global stage.

Until fairly recently, the U.S. was one of those countries. But while chips are still largely designed in America, its capacity to produce them has declined precipitously. Only 12 percent of the world’s microchip production takes place in the U.S., down from 37 percent in 1990. That percentage declines further when you exclude “legacy” chips with wider spaces between transistors — the vast majority of bleeding-edge chips are manufactured in Taiwan, and most factories not found on that island reside in Asian nations like South Korea, China and Japan.

For a long time, few in Washington worried about America’s flagging chip production. Manufacturing in the U.S. is expensive, and offshoring production to Asia while keeping R&D stateside was a good way to cut costs.

Two things changed that calculus: the Covid-19 pandemic and rising tensions between the U.S. and China.

Abrupt work stoppages sparked by viral spread in Asia sent shockwaves through finely tuned global supply chains. The flow of microchips ceased almost overnight, and then struggled to restart under new Covid surges and ill-timed extreme weather events. Combined with a spike in demand for microelectronics (sparked by generous government payouts to citizens stuck at home), the manufacturing stutter kicked off a chip shortage from which the world is still recovering.

Even before the pandemic, growing animosity between Washington and Beijing caused officials to question the wisdom of ceding chip production to Asia. China’s increasingly bellicose threats against Taiwan caused some to conjure up nightmare scenarios of an invasion or blockade that would sever the West from its supply of chips. The Chinese government was also pouring billions of dollars into a crash program to boost its own lackluster chip industry, prompting fears that America’s top foreign adversary could one day corner the market.

By 2020 the wheels had begun to turn on Capitol Hill. In January 2021, lawmakers passed as part of their annual defense bill the CHIPS for America Act, legislation authorizing federal payouts for chip manufacturers. But they then struggled to finance those subsidies. Although they quickly settled on more than $52 billion for chip manufacturing and research, lawmakers had trouble decoupling those sweeteners from sprawling anti-China “competitiveness” bills that stalled for over a year.

But those subsidies, as well as new tax credits for the chip industry, were finally sent to Biden’s desk in late July. Intel isn’t the only company that’s promised to supercharge U.S. projects once that money comes through — Samsung, for example, is suggesting it will expand its new $17 billion chip plant outside of Austin, Texas, to a nearly $200 billion investment. Lawmakers are already touting the subsidies as a key step toward an American renaissance in high-tech manufacturing.

Quietly, however, many of those same lawmakers — along with industry lobbyists and national security experts — fear all the chip subsidies in the world will fall flat without enough high-skilled STEM workers. And they accuse Congress of failing to seize multiple opportunities to address the problem.

STEM help wanted

In Columbus, just miles from the Johnstown field where Intel is breaking ground, most officials don’t mince words: The tech workers needed to staff two microchip factories, let alone eight, don’t exist in the region at the levels needed.

“We’re going to need a STEM workforce,” admitted Jon Husted, Ohio’s Republican lieutenant governor.

But Husted and others say they’re optimistic the network of higher ed institutions spread across Columbus — including Ohio State University and Columbus State Community College — can beef up the region’s workforce fast.

“I feel like we’re built for this,” said David Harrison, president of Columbus State Community College. He highlighted the repeated refrain from Intel officials that 70 percent of the 3,000 jobs needed to fill the first two factories will be “technician-level” jobs requiring two-year associate degrees. “These are our jobs,” Harrison said.

Harrison is anxious, however, over how quickly he and other leaders in higher ed are expected to convince thousands of students to sign up for the required STEM courses and join Intel after graduation. The first two factories are slated to be fully operational within three years, and will need significant numbers of workers well before then. He said his university still lacks the requisite infrastructure for instruction on chip manufacturing — “we’re missing some wafer processing, clean rooms, those kinds of things” — and explained that funding recently provided by Intel and the National Science Foundation won’t be enough. Columbus State will need more support from Washington.

“I don’t know that there’s a great Plan B right now,” said Harrison, adding that the new facilities will run into “the tens of millions.”

A lack of native STEM talent isn’t unique to the Columbus area. Across the country, particularly in regions where the chip industry is planning to relocate, officials are fretting over a perceived lack of skilled technicians. In February, Taiwan Semiconductor Manufacturing Company (TSMC) cited a shortage of skilled workers when announcing a six-month delay in the move-in date for its new plant in Arizona.

“Whether it’s a licensure program, a two-year program or a Ph.D., at all levels, there is a shortfall in high-tech STEM talent,” said Phillips. The NSB member highlighted the “missing millions of people that are not going into STEM fields — that basically are shut out, even beginning in K-12, because they’re not exposed in a way that attracts them to the field.”

Industry groups, like the National Association of Manufacturers, have long argued a two-pronged approach is necessary when it comes to staffing the high-tech sector: reevaluating immigration policy while also investing heavily in workforce development.

The abandoned House and Senate competitiveness bills both included provisions that would have enhanced federal support for STEM education and training. Among other things, the House bill would have expanded Pell Grant eligibility to students pursuing career-training programs.

“We have for decades incentivized degree attainment and not necessarily skills attainment,” said Robyn Boerstling, NAM’s vice president of infrastructure, innovation and human resources policy. “There are manufacturing jobs today that could be filled with six weeks of training, or six months, or six years; we need all of the above.”

But those provisions were scrapped, after Senate leadership decided a conference between the two chambers on the bills was too unwieldy to reach agreement before the August recess.

Katie Spiker, managing director of government affairs at National Skills Coalition, said the abandoned Pell Grant expansion shows Congress “has not responded to worker needs in the way that we need them to.” Amid criticisms that the existing workforce development system is unwieldy and ineffective, the decision to scrap new upgrades is a continuation of a trend of disinvesting in workers who hope to obtain the skills they need to meet employer demand.

“And it becomes an issue that only compounds itself over time,” Spiker said. “As technology changes, people need to change and evolve their skills.”

“If we’re not getting people skilled up now, then we won’t have people that are going to be able to evolve and skill up into the next generation of manufacturing that we’ll do five years from now.”

Congress finally sent the smaller Chips and Science Act — which includes the chip subsidies and tax credits, $200 million to develop a microchip workforce and a slate of R&D provisions — to the president’s desk in late July. The bill is expected to enhance the domestic STEM pool (at least on the margins). But it likely falls short of the generational investments many believe are needed.

“You could make some dent in it in six years,” said Phillips. “But if you really want to solve the problem, it’s closer to a 20-year investment. And the ability of this country to invest in anything for 20 years is not phenomenal.”

Immigration Arms Race

The microchip industry is in the midst of a global reshuffling that’s expected to last the better part of the decade — and the U.S. isn’t the only country rolling out the red carpet. Europe, Canada, Japan and other regions are also worried about their security, and preparing sweeteners for microchip firms to set up shop in their borders. Cobbling together an effective STEM workforce in a short time frame will be key to persuading companies to choose America instead.

That will be challenging at the technician level, which represents around 70 percent of workers in most microchip factories. But those jobs require only two-year degrees — and over a six-year period, it’s possible a sustained education and recruitment effort can produce enough STEM workers to at least keep the lights on.

It’s a different story entirely for Ph.D.s and master’s degrees, which take much longer to earn and which industry reps say make up a smaller but crucial component of a factory’s workforce.

Gabriela González, Intel’s head of global STEM research, policy and initiatives, said about 15 percent of factory workers must have doctorates or master’s degrees in fields such as material and electrical engineering, computer science, physics and chemistry. Students coming out of American universities with those degrees are largely foreign nationals — and increasingly, they’re graduating without an immigration status that lets them work in the U.S., and with no clear pathway to achieving that status.

A National Science Board estimate from earlier this year shows a steadily rising proportion of foreign-born students with advanced STEM skills. That’s especially true for degrees crucial to the chip industry — nearly 60 percent of computer science Ph.D.s are foreign born, as are more than 50 percent of engineering doctorates.

“We are absolutely reliant on being able to hire foreign nationals to fill those needs,” said Intel’s Shahoulian. Like many in the chip industry, Shahoulian contends there simply aren’t enough high-skilled STEM professionals with legal status to simultaneously serve America’s existing tech giants and an influx of microchip firms.

Some academics, such as Howard University’s Ron Hira, suggest the shortage of workers with STEM degrees is overblown, and industry simply seeks to import cheaper, foreign-born labor. But that view contrasts with those held by policymakers on Capitol Hill or people in the scientific and research communities. In a report published in late July by the Government Accountability Office, all 17 of the experts surveyed agreed the lack of a high-skilled STEM workforce was a barrier to new microchip projects in the U.S. — and most said some type of immigration reform would be needed.

Many, if not most, of the foreign nationals earning advanced STEM degrees from U.S. universities would prefer to stay and work in the country. But America’s immigration system is turning away these workers in record numbers — and at the worst possible time.

Ravi (not his real name, given his tenuous immigration status) is an Indian national. Nearly three years ago, he graduated from a STEM master’s program at a prestigious eastern university before moving to California to work as a design verification lead at an international chip company. He’s applied three times for an H-1B visa, a high-skilled immigration program used extensively by U.S. tech companies. But those visas are apportioned via a lottery, and Ravi lost each time. His current visa only allows him to work through the end of year — so Ravi is giving up and moving to Canada, where he’s agreed to take a job with another chip company. Given his skill set, he expects to quickly receive permanent legal status.

“The application process is incredibly simple there,” said Ravi, noting that Canadian officials were apologetic over their brief 12-week processing time (they’re swamped by refugee applications, he said).

If given the choice, Ravi said he would’ve probably stayed in California. But his story now serves as a cautionary tale for his younger brother back home. “Once he sort of completed his undergrad back in India, he did mention that he is looking at more immigration-friendly countries,” Ravi said. “He’s giving Canada more thought, at this point, than the United States.”

Ravi’s story is far from unique, particularly for Indian nationals. The U.S. imposes annual per-country caps on green cards — and between a yearly crush of applicants and a persistent processing backlog, Indians (regardless of their education or skill level) can expect to wait as long as 80 years for permanent legal status. A report released earlier this year by the libertarian Cato Institute found more than 1.4 million skilled immigrants are now stuck in green card backlogs, just a slight drop from 2020’s all-time high of more than 1.5 million.

The third rail of U.S. politics

The chip industry has shared its anxiety over America’s slipping STEM workforce with Washington, repeatedly asking Congress to make it easier for high-skilled talent to stay. But unlike their lobbying for subsidies and tax breaks — which has gotten downright pushy at times — they’ve done so very quietly. While chip lobbyists have spent months telling anyone who will listen why the $52 billion in financial incentives are a “strategic imperative,” they’ve only recently been willing to discuss their immigration concerns on the record.

In late July, nine major chip companies planned to send an open letter to congressional leadership warning that the shortage of high-skilled STEM workers “has truly never been more acute” and urging lawmakers to “enact much-needed green card reforms.” But the letter was pulled at the last minute, after some companies panicked about wading into a tense immigration debate at the wrong time.

Leaders in the national security community have been less shy. In May, more than four dozen former officials sent a letter to congressional leadership urging them to shore up America’s slipping immigration edge before Chinese technology leapfrogs ours. “With the world’s best STEM talent on its side, it will be very hard for America to lose,” they wrote. “Without it, it will be very hard for America to win.”

The former officials exhorted lawmakers to take up and pass provisions in the House competitiveness bill that would’ve lifted green card caps for foreign nationals with STEM Ph.D.s or master’s degrees. It’d be a relatively small number of people — a February study from Georgetown University’s Center for Security and Emerging Technology suggested the chip industry would only need around 3,500 foreign-born workers to effectively staff new U.S.-based factories.

“This is such a small pool of people that there’s already an artificial cap on it,” said Klon Kitchen, a senior fellow focused on technology and national security at the conservative American Enterprise Institute.

Kitchen suggested the Republican Party’s wariness toward immigration shouldn’t apply to these high-skilled workers, and some elected Republicans agree. Sen. John Cornyn, whose state of Texas is poised to gain from the expansion of chip plants outside Austin, took up the torch — and almost immediately got burned.

Sen. Chuck Grassley, Iowa’s senior Republican senator, blocked repeated attempts by Cornyn, Democrats and others to include the green card provision in the final competitiveness package. Finding relief for a small slice of the immigrant community, Grassley reasoned, “weakens the possibility to get comprehensive immigration reform down the road.” He refused to budge even after Biden administration officials warned him of the national security consequences in a classified June 16 briefing, which was convened specifically for him. The effort has been left for dead (though a push to shoehorn a related provision into the year-end defense bill is ongoing).

Many of Grassley’s erstwhile allies are frustrated with his approach. “We’ve been talking about comprehensive immigration reform for how many decades?” asked Kitchen, who said he’s “not inclined” to let America’s security concerns “tread water in the background” while Congress does nothing to advance broader immigration bills.

Most Republicans in Congress agree with Kitchen. But so far it’s Cornyn, not Grassley, who’s paid a price. After helping broker a deal on gun control legislation in June, Cornyn was attacked by Breitbart and others on his party’s right flank for telling a Democratic colleague immigration would be next.

“Immigration is one of the most contentious issues here in Congress, and we’ve shown ourselves completely incapable of dealing with it on a rational basis,” Cornyn said in July. The senator said he’d largely given up on persuading Grassley to abandon his opposition to new STEM immigration provisions. “I would love to have a conversation about merit-based immigration,” Cornyn said. “But I don’t think, under the current circumstances, that’s possible.”

Cornyn blamed that in part on the far right’s reflexive outrage to any easing of immigration restrictions. “Just about anything you say or do will get you in trouble around here these days,” he said.

Given that reality, few Republicans are willing to stick their necks out on the issue.

“If you look at the messaging coming out of [the National Republican Senatorial Committee] or [the Republican Attorneys General Association], it’s all ‘border, border, border,’” said Rebecca Shi, executive director of the American Business Immigration Coalition. Shi said even moderate Republicans hesitate to publicly advance arguments “championing these sensible visas for Ph.D. STEM talents for integrated circuits for semiconductors.”

“They’re like … ‘I can’t say those phrases until after the elections,’” Shi said.

That skittishness extends to state-level officials — Ohio’s Husted spent some time expounding on the benefits of “bringing talented people here to do the work in America, rather than having companies leave America to have it done somewhere else.” He suggested that boosting STEM immigration would be key to Intel’s success in his state. But when asked whether he’s taken that message to Ohio’s congressional delegation — after all, he said he’d been pestering them to pass the chip subsidies — Husted hedged.

“My job is to do all I can for the people of the state of Ohio. There are other people whose job it is to message those other things,” Husted said. “But if asked, you heard what my answer is.”

Of course, Republicans also pin some of the blame on Democrats. “The administration ignores the fire at the border and the chaos there, which makes it very hard to have a conversation about controlling immigration flows,” Cornyn said.

And while Democratic lawmakers reject that specific concern, some admit their side hasn’t prioritized STEM immigration as it should.

“Neither team has completely clean hands,” said Sen. Mark Warner, the chair of the Senate Intelligence Committee. Warner noted that Democrats have also sought to hold back STEM immigration fixes as “part of a sweetener” so that business-friendly Republicans would in turn back pathways to citizenship for undocumented immigrants. He also dinged the chip companies, claiming the issue is “not always as straightforward” as the industry would like to frame it and that tech companies sometimes hope to pay less for foreign-born talent.

But Warner still supports the effort to lift green card caps for STEM workers. “Without that high-skilled immigration, it’s not like those jobs are going to disappear,” he said. “They’re just gonna move to another country.”

And despite their rhetoric, it’s hard to deny that congressional Republicans are largely responsible for continued inaction on high-skilled immigration — even as their allies in the national security space become increasingly insistent.

Stuck on STEM immigration

Though they’ve had to shrink their ambitions, lawmakers working to lift green card caps for STEM immigrants haven’t given up. A jurisdictional squabble between committees in July prevented advocates from including in the House’s year-end defense bill a provision that would’ve nixed the caps for Ph.D.s in “critical” STEM fields. They’re now hoping to shoehorn the provision into the Senate’s defense bill instead, and have tapped Republican Sen. Thom Tillis of North Carolina as their champion in the upper chamber.

But Tillis is already facing pushback from the right. And despite widespread support, few truly believe there’s enough momentum to overcome Grassley and a handful of other lawmakers willing to block any action.

“Most members on both sides recognize that this is a problem they need to resolve,” said Intel’s Shahoulian. “They’re just not at a point yet where they’re willing to compromise and take the political hits that come with it.”

The global chip industry is moving in the meantime. While most companies are still planning to set up shop in the U.S. regardless of what happens with STEM immigration, Shahoulian said inaction on that front will inevitably limit the scale of investments by Intel and other firms.

“You’re already seeing that dynamic playing out,” he said. “You’re seeing companies set up offices in Canada, set up offices elsewhere, move R&D work elsewhere in the world, because it is easier to retain talent elsewhere than it is here.”

“This is an issue that will progressively get worse,” Shahoulian said. “It’s not like there will be some drop-dead deadline. But yeah, it’s getting difficult.”

Intel is still plowing ahead in Johnstown — backhoes are churning up dirt, farmers have been bought out of homes owned by their families for generations and the extensive water and electric infrastructure required for eight chip factories is being laid. Whether those bets will pay off in the long-term may rest on Congress’ ability to thread the needle on STEM immigration. And there’s little optimism at the moment.

Sen. Maria Cantwell, the chair of the Senate Commerce Committee, said she sometimes wishes she could “shake everybody and tell them to wake up.” But she believes economic and geopolitical realities will force Congress to open the door to high-skilled foreign workers — eventually.

“I think the question is whether you do that now or in 10 years,” Cantwell said. “And you’ll be damn sorry if you wait for 10 years.”

Sat, 30 Jul 2022 23:00:00 -0500

Monarch Casino: Best Gaming Stock Bet, Say Portfolio Wealth Builders
(Image: Business on Wall Street in Manhattan. Pgiam/iStock via Getty Images)

The primary focus of this article is Monarch Casino & Resort, Inc. (NASDAQ:MCRI)

Investment Thesis

21st-century paces of change in technology and in rational behavior (not emotional reactions) seriously disrupt the commonly accepted, productive investment strategy of the 20th century.

One required change is the shortening of forecast horizons, with a shift from the multi-year passive approach of buy and hold to the active strategy of specific price-change target achievement or time-limit actions, with reinvestment set to new nearer-term targets.

That change avoids the irretrievable loss of invested time spent destructively by failure to recognize shifting evolutions like the cases of IBM, Kodak, GM, Xerox, General Electric, and many others.

It recognizes the progress in medical, communication and information technologies and enjoys their operational benefits already present in extended lifetimes, trade-commission-free investments, and coming benefits in transportation utilizations and energy usage.

But it requires the ability to make valid direct comparisons of value between investment reward prospects and risk exposures in the uncertain future. Since uncertainty expands as the future dimension increases, shorter forecast horizons are a means of improving the reward-to-risk comparison.

That shortening is now best attended at the investment entry point by knowing Market-Maker ("MM") expectations for coming prices. When reached, their updates are then reintroduced at the exit/reinvestment point and the term of expectations for the required coming comparisons are recognized as the decision entry point to move forward.

The MM's constant presence, extensive global communications and human resources dedicated to monitoring industry-focused competitive evolution sharpens MM price expectations, essential to their risk-avoidance roles.

Their roles require firm capital be only temporarily risk-exposed, so are hedged by derivative-securities deals to avoid undesired price changes. The deals' prices and contracts provide a window to MM price expectations.

Information technology via the internet makes investment monitoring and management time and attention efficient despite its increase in frequency.

Once an investment choice is made and buy transaction confirmation is received, a target-price GTC sell order for the confirmed number of shares at the target price or better should be placed. Keeping trade actions entered through the internet on your lap/desk-top or cell phone should avoid trade commission charges. Your broker's internal system should keep you informed of your account's progress.

Your own private calendar record should be kept of the date 63 market days (or 91 calendar days) beyond the trade's confirmation date as a time-limit alert to check if the GTC order has not been executed. If not, then start your exit and reinvestment decision process.

The 3-months' time limit is what we find to be a good choice, but may be extended some if desired. Beyond 5-6 months' time investments start to work against the process and are not recommended.
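As a concrete illustration, the 91-calendar-day time-limit alert described above can be computed mechanically. A minimal sketch (the function name and defaults are illustrative, not part of any broker's system):

```python
from datetime import date, timedelta

def time_limit_alert(confirmation: date, calendar_days: int = 91) -> date:
    """Date to check whether an unexecuted GTC sell order should be closed out.

    91 calendar days approximates the 63-market-day (3-month)
    maximum-patience holding period described above.
    """
    return confirmation + timedelta(days=calendar_days)

# A trade confirmed on 2022-07-01 should be reviewed by 2022-09-30.
print(time_limit_alert(date(2022, 7, 1)))  # 2022-09-30
```

If the GTC order is still open on that date, the exit and reinvestment decision process begins.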

For investments guided by this article or others by me target prices will always be found as the high price in the MM forecast range.

Description of Equity Subject Company

"Monarch Casino & Resort, Inc., through its subsidiaries, owns and operates the Atlantis Casino Resort Spa, a hotel and casino in Reno, Nevada. The company also owns and operates the Monarch Casino Resort Spa Black Hawk in Black Hawk, Colorado. As of December 31, 2021, its Atlantis Casino Resort Spa featured approximately 61,000 square feet of casino space; 818 guest rooms and suites; 8 food outlets; 2 gourmet coffee and pastry bars; a 30,000 square-foot health spa and salon with an enclosed pool; approximately 52,000 square feet of banquet, convention, and meeting room space. The company's Atlantis Casino Resort Spa also featured approximately 1,400 slot and video poker machines; approximately 37 table games, including blackjack, craps, roulette, and others; a race and sports book; a 24-hour live keno lounge; and a poker room. In addition, its Monarch Casino Resort Spa Black Hawk featured approximately 60,000 square feet of casino space; approximately 1,100 slot machines; approximately 40 table games; 10 bars and lounges; 4 dining options; 516 guest rooms and suites. The company was founded in 1972 and is based in Reno, Nevada."

Source: Yahoo Finance

Estimates by Street Analysts

Yahoo Finance

These growth estimates have been made by and are collected from Wall Street analysts to suggest what conventional methodology currently produces. The typical variations across forecast horizons of different time periods illustrate the difficulty of making value comparisons when the forecast horizon is not clearly defined.

Risk and Reward Balances Among MCRI Competitors

Figure 1

MM hedging forecasts

Used with permission.

The risk dimension is of real price draw-downs at their most extreme point while being held in previous pursuit of upside rewards similar to the ones currently being seen. They are measured on the red vertical scale. Reward expectations are measured on the green horizontal scale.

Both scales are of percent change from zero to 25%. Any stock or ETF whose present risk exposure exceeds its reward prospect will be above the dotted diagonal line. Capital-gain-attractive to-buy issues are in the directions down and to the right.

Our principal interest is in MCRI at location [11], at the lower right-hand edge of the competitor crowd. A "market index" norm of reward~risk trade-offs is offered by SPY at [7]. Most appealing by this Figure 1 view for wealth-building investors is MCRI.

Comparing competitive features of Casino Gaming Providers

The Figure 1 map provides a good visual comparison of the two most important aspects of every equity investment in the short term. There are other aspects of comparison which this map sometimes does not communicate well, particularly when general market perspectives like those of SPY are involved. Where questions of "how likely" are present, other comparative tables, like Figure 2, may be useful.

Yellow highlighting of the table's cells emphasizes factors important to securities valuations, and marks MCRI as the security most promising for near capital gain, as ranked in column [R].

Figure 2

detail comparative data

Used with permission.

Why do all this math?

Figure 2's purpose is to attempt universally comparable answers, stock by stock, of: a) How BIG the prospective price gain payoff may be; b) how LIKELY the payoff will be a profitable experience; c) how SOON it may happen; and d) what price drawdown RISK may be encountered during its active holding period.

Readers familiar with our analysis methods after quick examination of Figure 2 may wish to skip to the next section viewing price range forecast trends for MCRI.

Column headers for Figure 2 define investment-choice preference elements for each row stock whose symbol appears at the left in column [A]. The elements are derived or calculated separately for each stock, based on the specifics of its situation and current-day MM price-range forecasts. Data in red numerals are negative, usually undesirable to "long" holding positions. Table cells with yellow fills are of data for the stocks of principal interest and of all issues at the ranking column, [R].

The price-range forecast limits of columns [B] and [C] get defined by MM hedging actions to protect firm capital required to be put at risk of price changes from volume trade orders placed by big-$ "institutional" clients.

[E] measures potential upside risks for MM short positions created to fill such orders, and reward potentials for the buy-side positions so created. Prior forecasts like the present provide a history of relevant price draw-down risks for buyers. The most severe ones actually encountered are in [F], during holding periods in effort to reach [E] gains. Those are where buyers are emotionally most likely to accept losses.

The Range Index [G] tells where today's price lies relative to the MM community's forecast of upper and lower limits of coming prices. Its numeric is the percentage proportion of the full low to high forecast seen below the current market price.
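That definition reduces to simple arithmetic; a sketch (the price and range values are illustrative, not taken from the tables above):

```python
def range_index(low: float, high: float, price: float) -> float:
    """Percentage of the low-to-high forecast range lying below the current price."""
    return 100 * (price - low) / (high - low)

# A $62 price inside a $60-$75 forecast range yields a Range Index near 13:
# most of the expected price movement lies above today's price.
print(round(range_index(60, 75, 62)))  # 13
```

A Range Index of 0 means today's price sits at the bottom of the forecast range; 100 means it sits at the top.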

[H] tells what proportion of the [L] trial of prior like-balance forecasts have earned gains by either having price reach its [B] target or be above its [D] entry cost at the end of a 3-month max-patience holding period limit. [ I ] gives the net gains-losses of those [L] experiences.

What makes MCRI most attractive in the group at this point in time is its ability to produce capital gains most consistently at its present operating balance between share price risk and reward at the Range Index [G]. At a RI of 12, today's price is near the bottom of its forecast range, with price expectations to the upside seven times those to the downside. These are not our expectations, but those of Market-Makers acting in support of the institutional investment organizations that build the values of their typical multi-billion-dollar portfolios. Credibility of the [E] upside prospect as evidenced in the [I] payoff at +18% is shown in [N].

Further Reward~Risk trade-offs involve using the [H] odds for gains with the 100 - H loss odds as weights for N-conditioned [E] and for [F], for a combined-return score [Q]. The typical position holding period [J] on [Q] provides a figure of merit [fom] ranking measure [R] useful in portfolio position preferences. Figure 2 is row-ranked on [R] among alternative candidate securities, with MCRI in top rank.
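One plausible reading of that odds-weighted trade-off is sketched below. The author's exact figure-of-merit arithmetic is not published, so this formula and its inputs are assumptions, not the actual computation behind column [Q]:

```python
def combined_return_score(win_odds_pct: float, upside_pct: float, drawdown_pct: float) -> float:
    """ASSUMED odds-weighted blend: win odds [H] weight the credibility-
    conditioned upside [E]; the complementary loss odds weight the worst-case
    drawdown [F], which is a negative number."""
    loss_odds_pct = 100.0 - win_odds_pct
    return (win_odds_pct * upside_pct + loss_odds_pct * drawdown_pct) / 100.0

# Illustrative inputs: 93% win odds, +18% upside, -9% worst-case drawdown.
print(combined_return_score(93.0, 18.0, -9.0))  # 16.11
```

The design intent is simply that a high payoff earns little credit unless the odds of actually collecting it are also high.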

Along with the candidate-specific stocks these selection considerations are provided for the averages of some 3,000 stocks for which MM price-range forecasts are available today, and 20 of the best-ranked (by fom) of those forecasts, as well as the forecast for S&P500 Index ETF (SPY) as an equity-market proxy.

Current-market index SPY is only moderately competitive as an investment alternative. Its Range Index of 42 indicates half of its forecast range is to the upside, while three quarters of previous SPY forecasts at this range index produced profitable outcomes.

As shown in column [T] of figure 2, those levels vary significantly between stocks. What matters is the net gain between investment gains and losses actually achieved following the forecasts, shown in column [I]. The Win Odds of [H] tells what proportion of the trial RIs of each stock were profitable. Odds below 80% often have proven to lack reliability.

Recent Forecast Trends of the Primary Subject

Figure 3

daily forecast trends

Used with permission.

Many investors confuse any time-repeating picture of stock prices with typical "technical analysis charts" of past stock price history. These are quite different in their content. Instead, here Figure 3's vertical lines are a daily-updated visual record of price range forecast limits expected in the coming few weeks and months. The heavy dot in each vertical is the stock's closing price on the day the forecast was made.

That market price point makes an explicit definition of the price reward and risk exposure expectations which were held by market participants at the time, with a visual display of their vertical balance between risk and reward.

The measure of that balance is the Range Index (RI).

With today's RI there is a 14.8% upside price change in prospect. Of the prior 27 forecasts like today's RI, 25 have been profitable. The market's actions following prior forecasts became accomplishments of +15% gains in 30 market days, or 6 weeks. So history's advantage could be repeated eight times or more in a 252-market-day year, which compounds into a CAGR of +232%.
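The compounding arithmetic behind an annualized figure like that can be checked directly. A sketch; note the result is sensitive to how many repetitions a 252-market-day year is assumed to hold (8.4 non-overlapping 30-day periods give about +223%, exactly eight give about +206%):

```python
def annualized_gain(gain_per_period: float, period_days: int, year_days: int = 252) -> float:
    """Compound a repeated per-period gain over a market-day year."""
    periods_per_year = year_days / period_days  # 252 / 30 = 8.4
    return (1 + gain_per_period) ** periods_per_year - 1

# +15% repeated every 30 market days, compounded over a 252-day year:
print(round(100 * annualized_gain(0.15, 30)))  # 223
```

The point is not the precise number but how quickly modest per-holding gains compound when the holding period is short.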

Also please note the smaller, lower picture in Figure 3. It shows the past 5-year distribution of Range Indexes, with the current level visually marked. For MCRI nearly all recent past forecasts have been of higher prices and Range Indexes.


Conclusion

Based on direct comparisons of MCRI with other casino gaming competitors, there are strong wealth-building reasons to prefer a capital-gain-seeking buy in Monarch Casino & Resort, Inc. over the other examined alternatives.

Fri, 29 Jul 2022 04:37:00 -0500

Risk Analytics Market Analysis 2022 With Top Leaders, Size, Share, Growth, Technical Industry Vision Throughout the World till 2027

The MarketWatch News Department was not involved in the creation of this content.

Jun 27, 2022 (The Expresswire) -- The "Risk Analytics Market" is expected to grow considerably in the forecast period 2022-2027. COVID-19 can affect the economy in three main ways: by directly affecting production and demand, by creating supply chain and market disruption, and by its financial impact on firms and financial markets. The report offers a dashboard overview of leading companies like ..., encompassing their successful marketing strategies, market contribution, and recent developments in both historic and present contexts.

The global risk analytics market was valued at USD 9.15 billion in 2017, and is expected to reach a value of USD 19.31 billion by 2027, at a CAGR of 13.26% during the forecast period (2021-2027). The market is segmented by type of offering, applications, end-user vertical, and geography. This report focuses on the adoption of these solutions for various applications across various regions. The study also emphasizes the latest trends, industry activities, and vendor market activities.
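As a quick consistency check, the stated endpoints and growth rate can be reconciled; they fit a six-year compounding span (e.g., 2021 through 2027). A sketch:

```python
def implied_cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate implied by two endpoint values."""
    return (end_value / start_value) ** (1 / years) - 1

# Growing 9.15 to 19.31 over six years implies ~13.26% per year,
# matching the report's stated CAGR.
print(round(100 * implied_cagr(9.15, 19.31, 6), 2))  # 13.26
```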

Get a trial copy of the report at-

Company Coverage: -

- IBM Corporation

- Oracle Corporation

- Moody's Analytics

- SAP SE

- SAS Institute Inc.

- Mu Sigma

- Teradata

- Strategic Thought Group

- Sybase

- Real Time Risk Systems

- Panalytix

- Financial Sciences

Get a trial Copy of the Risk Analytics Market Report 2022

Market Players Competitor Analysis:

Today, risk analytics techniques are enabling organizations and risk managers to measure, quantify, and even predict risk with more certainty than ever before. Analytics excels at cracking the complex nature of businesses and helps establish a baseline for measuring risk across the organization, drawing together many strands of risk into one consolidated system and giving executives clarity in identifying, viewing, understanding, and managing risk.

As part of digital transformation projects, firms in the market will allocate large amounts of money to risk management technologies and services. IT spending focused on risk and operational efficiency is reported to be among the top IT priorities of businesses. According to the Global Association of Risk Professionals, the capital markets, banking, and insurance sectors are estimated to spend USD 96 billion on risk information technologies and services.

The market is expected to move towards a unified approach to risk management, developing integrated risk management solutions that enable business units and functions to incorporate risk intelligence into the many actions they take. Deployment of these solutions over the cloud is expected to give SMEs the opportunity to make better decisions. However, the associated cyber risks are the major factor restraining the deployment of these solutions over the cloud.

BFSI to Witness Huge Adoption of Risk Analytics Solutions

Banks across the globe are realizing that they need a more rational approach for managing a growing plethora of risks enveloping the banking and financial industries landscape, and have now understood the significance of risk analytics. The objective of risk analytics is to establish an integrated approach and consistent set of processes that reduce the redundant risk and control activities that eliminate duplication in the business units, and cut down costs.

At approximately 73% of banks, the areas expected to drive the highest investment in risk analytics are data quality and sourcing, systems integration, and modeling.

Risk analytics enables banks and finance institutions to move away from the “silo” approach to risk management and towards a “holistic” view of enterprise-wide risks. For instance, in Operational Risk Management (ORM), the number of transactions that need to be monitored is growing at an exponential rate, straining current banking infrastructure and driving the market for risk analytics.

North America to Hold the Largest Share in 2021

North America held the largest market share in 2017, and the Asia-Pacific region is expected to grow at the highest CAGR during the forecast period 2021-2027.

The United States holds the largest share of the market due to the presence of a large number of service and software providers, whereas the presence of a large number of SMEs and increasing technological penetration in the Asia-Pacific region, especially in countries like China, India, and Vietnam, is expected to fuel market growth in this region.

Key Developments in the Market

• January 2021 - SAS enabled artificial intelligence (AI) solutions by leveraging machine learning, deep learning, text analytics, forecasting, and statistics. Its latest SAS Platform release includes a new offering, SAS Visual Text Analytics, and significant enhancements to SAS Visual Data Mining and Machine Learning; both take advantage of the new capabilities available in SAS Viya. These solutions are likely to address portfolios in ERP, CRM, and risk management software.

The major players include IBM, Oracle, SAS, and SAP, amongst others.

To Understand How COVID-19 Impact is Covered in This Report. Get trial copy of the report at -

Reasons to Purchase this Report:

● To examine the increased usage of risk analytics in industry that affects the global market scenario
● To analyze various perspectives of the market with the help of Porter’s five forces analysis
● To know the modality and application that are expected to dominate the market
● To know the regions that are expected to witness the fastest growth during the forecast period
● To identify the latest developments, market shares, and strategies employed by major market players

Customization of the Report:

● This report can be customized to meet your requirements. Please connect with our representative, who will ensure you get a report that suits your needs.

Enquire before Purchasing this report at-

Regional Analysis: -

- North America

- Asia-Pacific

- Europe

- South America

- Africa

Some Major Points from TOC: -

1. Introduction
1.1 Key Deliverables of the Study
1.2 Study Assumptions
1.3 Market Definition
1.4 Key Findings of the Study
2. Research Approach and Methodology
3. Executive Summary
4. Market Dynamics
4.1 Market Overview
4.2 Factors Driving the Market
4.2.1 Increased Market Risk Due To Economic Instability And Market Competitiveness
4.2.2 Global Regulatory Frameworks And Government Policies
4.3 Factors Restraining the Market
4.3.1 High Installation And Operational Costs
4.3.2 Lack Of Awareness Of Risk Analytics Tools Especially In SMEs
4.4 Industry Value Chain Analysis
4.5 Industry Attractiveness – Porter's Five Industry Forces Analysis
4.5.1 Bargaining Power of Suppliers
4.5.2 Bargaining Power of Consumers
4.5.3 Threat of New Entrants
4.5.4 Threat of Substitute Products or Services
4.5.5 Competitive Rivalry among Existing Competitors
5. Technology Snapshot
6. Global Risk Analytics Market Segmentation
6.1 Type
6.1.1 Software
6.1.2 Services
6.2 Application
6.2.1 Credit Risk
6.2.2 Market and Liquidity Risk
6.2.3 Operational Risk
6.2.4 Portfolio Risk Management
6.3 End-user Vertical
6.3.1 Banking and Financial Services
6.3.2 Retail
6.3.3 Healthcare
6.3.4 Manufacturing
6.3.5 Others (Government, Energy, Mining)
6.4 By Geography
6.4.1 North America
6.4.2 Europe
6.4.3 Asia-Pacific
6.4.4 Rest of the World
7. Competitive Intelligence – Company Profiles
7.1 IBM Corporation
7.2 Oracle Corporation
7.3 Moody's Analytics
7.4 SAP SE
7.5 SAS Institute Inc.
7.6 Mu Sigma
7.7 Teradata
7.8 Strategic Thought Group
7.9 Sybase
7.10 Real Time Risk Systems
7.11 Panalytix
7.12 Financial Sciences
*list not exhaustive
8. Investment Analysis
9. Future Outlook of Risk Analytics Market

Browse complete table of contents at-

About Us: Market is changing rapidly with the ongoing expansion of the industry. Advancement in the technology has provided today's businesses with multifaceted advantages resulting in daily economic shifts. Thus, it is very important for a company to comprehend the patterns of the market movements in order to strategize better. An efficient strategy offers the companies with a head start in planning and an edge over the competitors. Market Reports World is the credible source for gaining the market reports that will provide you with the lead your business needs.

Contact Us:

Market Reports World


US (+1) 424 253 0807
UK (+44) 203 239 8187



Press Release Distributed by The Express Wire

To view the original version on The Express Wire visit Risk Analytics Market Analysis 2022 With Top Leaders, Size, Share, Growth, Technical Industry Vision Throughout the World till 2027


Is there a problem with this press release? Contact the source provider Comtex at You can also contact MarketWatch Customer Service via our Customer Center.

