Can IBM Get Back Into HPC With Power10?

The “Cirrus” Power10 processor from IBM, which we codenamed for Big Blue because it refused to do it publicly and because we understand the value of a synonym here at The Next Platform, shipped last September in the “Denali” Power E1080 big iron NUMA machine. And today, the rest of the Power10-based Power Systems product line is being fleshed out with the launch of entry and midrange machines – many of which are suitable for supporting HPC and AI workloads as well as in-memory databases and other workloads in large enterprises.

The question is, will IBM care about traditional HPC simulation and modeling ever again with the same vigor that it has in past decades? And can Power10 help reinvigorate the HPC and AI business at IBM? We are not sure about the answer to the first question, and got the distinct impression from Ken King, the general manager of the Power Systems business, that HPC proper was not a high priority when we spoke to him back in February about this. But we continue to believe that the Power10 platform has some attributes that make it appealing for data analytics and other workloads that need to be either scaled out across small machines or scaled up across big ones.

Today, we are just going to talk about the five entry Power10 machines, which have one or two processor sockets in a standard 2U or 4U form factor, and then we will follow up with an analysis of the Power E1050, which is a four-socket machine that fits into a 4U form factor. And the question we wanted to answer was simple: Can a Power10 processor hold its own against X86 server chips from Intel and AMD when it comes to basic CPU-only floating point computing?

This is an important question because there are plenty of workloads that have not been accelerated by GPUs in the HPC arena, and for these workloads, the Power10 architecture could prove to be very interesting if IBM thought outside of the box a little. This is particularly true when considering the feature called memory inception, which is in effect the ability to build a memory area network across clusters of machines and which we have discussed a little in the past.

We went deep into the architecture of the Power10 chip two years ago when it was presented at the Hot Chip conference, and we are not going to go over that ground again here. Suffice it to say that this chip can hold its own against Intel’s current “Ice Lake” Xeon SPs, launched in April 2021, and AMD’s current “Milan” Epyc 7003s, launched in March 2021. And this makes sense because the original plan was to have a Power10 chip in the field with 24 fat cores and 48 skinny ones, using dual-chip modules, using 10 nanometer processes from IBM’s former foundry partner, Globalfoundries, sometime in 2021, three years after the Power9 chip launched in 2018. Globalfoundries did not get the 10 nanometer processes working, and it botched a jump to 7 nanometers and spiked it, and that left IBM jumping to Samsung to be its first server chip partner for its foundry using its 7 nanometer processes. IBM took the opportunity of the Power10 delay to reimplement the Power ISA in a new Power10 core and then added some matrix math overlays to its vector units to make it a good AI inference engine.

IBM also created a beefier core and dropped the core count back to 16 on a die in SMT8 mode, which is an implementation of simultaneous multithreading that has up to eight processing threads per core, and also was thinking about an SMT4 design which would double the core count to 32 per chip. But we have not seen that today, and with IBM not chasing Google and other hyperscalers with Power10, we may never see it. But it was in the roadmaps way back when.

What IBM has done in the entry machines is put two Power10 chips inside of a single socket to increase the core count, but it is looking like the yields on the chips are not as high as IBM might have wanted. When IBM first started talking about the Power10 chip, it said it would have 15 or 30 cores, which was a strange number, and that is because it kept one SMT8 core or two SMT4 cores in reserve as a hedge against bad yields. In the products that IBM is rolling out today, mostly for its existing AIX Unix and IBM i (formerly OS/400) enterprise accounts, the core counts on the dies are much lower, with 4, 8, 10, or 12 of the 16 cores active. The Power10 cores have roughly 70 percent more performance than the Power9 cores in these entry machines, and that is a lot of performance for many enterprise customers – enough to get through a few years of growth on their workloads. IBM is charging a bit more for the Power10 machines compared to the Power9 machines, according to Steve Sibley, vice president of Power product management at IBM, but the bang for the buck is definitely improving across the generations. At the very low end with the Power S1014 machine that is aimed at small and midrange businesses running ERP workloads on the IBM i software stack, that improvement is in the range of 40 percent, give or take, and the price increase is somewhere between 20 percent and 25 percent depending on the configuration.
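The 40 percent figure falls out of simple arithmetic: divide the per-core performance gain by the price increase. A quick sanity check, using only the ratios quoted above:

```python
# Illustrative arithmetic only: the ~70 percent per-core performance gain and
# the 20-25 percent price increase are the figures quoted in the article, and
# the result should land near the ~40 percent improvement IBM cites for the
# Power S1014.

def perf_per_dollar_gain(perf_ratio, price_ratio):
    """Return the fractional improvement in performance per dollar."""
    return perf_ratio / price_ratio - 1.0

perf_ratio = 1.70  # Power10 core roughly 70 percent faster than Power9 core
for price_increase in (0.20, 0.25):
    gain = perf_per_dollar_gain(perf_ratio, 1.0 + price_increase)
    print(f"price +{price_increase:.0%} -> perf/$ +{gain:.0%}")
# With a 20 percent price rise, perf/$ improves ~42 percent; with 25 percent,
# ~36 percent -- bracketing the roughly 40 percent that IBM quotes.
```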

Pricing is not yet available on any of these entry Power10 machines, which ship on July 22. When we find out more, we will do more analysis of the price/performance.

There are six new entry Power10 machines, the feeds and speeds of which are shown below:

For the HPC crowd, the Power L1022 and the Power L1024 are probably the most interesting ones because they are designed to only run Linux and, if they are like prior L-class machines in the Power8 and Power9 families, will have lower pricing for CPU, memory, and storage, allowing them to better compete against X86 systems running Linux in cluster environments. This will be particularly important as IBM pushes Red Hat OpenShift as a container platform for not only enterprise workloads but also for HPC and data analytics workloads that are also being containerized these days.

One thing to note about these machines: IBM is using its OpenCAPI Memory Interface, which as we explained in the past is using the “Bluelink” I/O interconnect for NUMA links and accelerator attachment as a memory controller. IBM is now calling this the Open Memory Interface, and these systems have twice as many memory channels as a typical X86 server chip and therefore have a lot more aggregate bandwidth coming off the sockets. The OMI memory makes use of a Differential DIMM form factor that employs DDR4 memory running at 3.2 GHz, and it will be no big deal for IBM to swap in DDR5 memory chips into its DDIMMs when they are out and the price is not crazy. IBM is offering memory features with 32 GB, 64 GB, and 128 GB capacities today in these machines and will offer 256 GB DDIMMs on November 14, which is how you get the maximum capacities shown in the table above. The important thing for HPC customers is that IBM is delivering 409 GB/sec of memory bandwidth per socket and 2 TB of memory per socket.
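The 409 GB/sec figure can be sanity-checked with back-of-the-envelope math. The channel count and per-channel rate below are our assumptions for illustration (16 OMI channels per socket, each carrying roughly 25.6 GB/sec of DRAM traffic), not numbers from an IBM spec sheet:

```python
# Rough per-socket memory bandwidth model. The Power10 OMI configuration
# assumed here (16 channels x ~25.6 GB/sec) is our guess at how IBM reaches
# the quoted 409 GB/sec; the X86 comparison assumes a typical 8-channel
# DDR4-3200 setup (3.2 GT/s x 8 bytes = 25.6 GB/sec per channel).

def socket_bandwidth_gbs(channels, gb_per_sec_per_channel):
    """Peak aggregate bandwidth per socket in GB/sec."""
    return channels * gb_per_sec_per_channel

omi = socket_bandwidth_gbs(16, 25.6)    # assumed Power10 OMI config
x86 = socket_bandwidth_gbs(8, 3.2 * 8)  # typical 8-channel DDR4-3200 socket
print(f"Power10 OMI: {omi:.1f} GB/s, 8-channel DDR4 X86: {x86:.1f} GB/s")
```

With twice the channels at the same per-channel rate, the Power10 socket lands at roughly double the X86 socket, consistent with the claim above.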

By the way, the only storage in these machines is NVM-Express flash drives. No disk, no plain vanilla flash SSDs. The machines also support a mix of PCI-Express 4.0 and PCI-Express 5.0 slots, and do not yet support the CXL protocol created by Intel and backed by IBM even though it loves its own Bluelink OpenCAPI interconnect for linking memory and accelerators to the Power compute engines.

Here are the different processor SKUs offered in the Power10 entry machines:

As far as we are concerned, the 24-core Power10 DCM feature EPGK processor in the Power L1024 is the only interesting one for HPC work, aside from what a theoretical 32-core Power10 DCM might be able to do. And just for fun, we sat down and figured out the peak theoretical 64-bit floating point performance, at all-core base and all-core turbo clock speeds, for these two Power10 chips and their rivals in the Intel and AMD CPU lineups. Take a gander at this:

We have no idea what the pricing will be for a processor module in these entry Power10 machines, so we took a stab at what the 24-core variant might cost to be competitive with the X86 alternatives based solely on FP64 throughput and then reckoned the performance of what a full-on 32-core Power10 DCM might be.
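Peak theoretical FP64 throughput is just cores times clock times FLOPs per cycle per core. A hedged sketch of that reckoning, where every core count, clock speed, and per-core FLOPs-per-cycle value is our assumption for illustration rather than a quoted spec (for instance, we assume a Power10 SMT8 core can retire 32 FP64 FLOPs per cycle, on par with an Ice Lake core's two 512-bit FMA units):

```python
# Peak FP64 = cores x clock (GHz) x FLOPs per cycle per core, giving GFLOPS.
# All inputs below are illustrative placeholders, not vendor-quoted all-core
# turbo figures.

def peak_fp64_gflops(cores, ghz, flops_per_cycle):
    """Peak theoretical FP64 throughput in GFLOPS."""
    return cores * ghz * flops_per_cycle

chips = {
    "Power10 24-core DCM (assumed)": (24, 3.4, 32),
    "Ice Lake Xeon (assumed)":       (40, 2.3, 32),
    "Milan Epyc (assumed)":          (64, 2.45, 16),
}
for name, (cores, ghz, fpc) in chips.items():
    print(f"{name}: {peak_fp64_gflops(cores, ghz, fpc):,.0f} GFLOPS")
```

Under these assumptions the three parts land in the same ballpark, which is the shape of the comparison the article describes.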

The answer is that IBM can absolutely compete, flops to flops, with the best Intel and AMD have right now. And it has a very good matrix math engine as well, which these chips do not.

The problem is, Intel has “Sapphire Rapids” Xeon SPs in the works, which we think will have four 18-core chiplets for a total of 72 cores, but only 56 of them will be exposed because of yield issues that Intel has with its SuperFIN 10 nanometer (Intel 7) process. And AMD has 96-core “Genoa” Epyc 7004s in the works, too. Power11 is several years away, so if IBM wants to play in HPC, Samsung has to get the yields up on the Power10 chips so IBM can sell more cores in a box. Big Blue already has the memory capacity and memory bandwidth advantage. We will see if its L-class Power10 systems can compete on price and performance once we find out more. And we will also explore how memory clustering might make for a very interesting compute platform based on a mix of fat NUMA and memory-less skinny nodes. We have some ideas about how this might play out.

Mon, 11 Jul 2022 12:01:00 -0500 Timothy Prickett Morgan
‘India is a perfect example of the application of open hybrid cloud’

You have seen the company grow from being just about enterprise Linux to becoming a multi-billion dollar open source enterprise products firm. Having stepped into Cormier’s shoes, are you planning any change in strategy?

The short answer is ‘No’. I’m pretty lucky that I have worked within about 20 feet of Paul for the last 10 years. So, I’ve had the opportunity to have a hand in the team we’ve built and the strategy we’ve built and the bets and positions we’ve made around open hybrid cloud. In my last role, I was heading all of our products and technology and business unit teams. Hence, I know the team and the strategy. And we will evolve. If we look at the cloud services market that’s moving fast, our commercial models will change there to make sure that as customers have a foot on prem (on premises) and in private cloud, we serve them well. As hybrid extends to edge (computing), it will also change how we approach that market. But our fundamental strategy around open hybrid cloud doesn’t change. So, it’s a nice spot to be here, where I don’t feel compelled to make any change, but focus more on execution. 

Tell us a bit about Red Hat’s focus on India, and your expansion plans in the country.

When we see the growth and opportunity in India, it mimics what we see in a lot of parts of the globe—software-defined innovation that is going to be the thing that lets enterprises compete. That could be in traditional markets where they’re leveraging their data centres; or it could be leveraging public cloud technologies. In certain industries, that software innovation is moving to the devices themselves, which we call edge. India is a perfect example of the application of open hybrid cloud because we can serve all of those use cases—from edge deployments in 5G and the adjacent businesses that will be built around that, to connectivity to the public clouds.

Correia (Marshall Correia is vice-president and general manager, India, South Asia at Red Hat): We have been operating in the country for multiple decades and our interest in India is two-fold. One is go-to-market in India, working with the Indian government, Indian enterprises, private sector as well as public sector enterprises. We have a global delivery presence in cities like Pune and Bengaluru. Whether you look at the front office, back office, or mid-office, we are deeply embedded into it (BSE, National Stock Exchange (NSE), Aadhaar, GST Network (GSTN), Life Insurance Corporation of India (LIC), SBI Insurance and most core banking services across India use Red Hat open source technologies). For instance, we work with Infosys on GSTN. So, I would say there is a little bit of Red Hat played out everywhere (in India) but with some large enterprises, we have a very deep relationship. 

Do you believe Red Hat is meeting IBM’s expectations? How often do you interact with Arvind Krishna, and what do you discuss?

About five years ago, Arvind and I were on stage together, announcing our new friendship around IBM middleware on OpenShift. I talk to him every few days. A lot of this credit goes to Paul. We’ve struck the balance with IBM. Arvind would describe it as Red Hat being “independent" (since) we have to partner with other cloud providers, other consulting providers, (and) other technology providers (including Verizon, Accenture, Deloitte, Tata Consultancy Services, and IBM Consulting). But IBM is very opinionated on Red Hat—they built their middleware to Red Hat, and we are their core choice for hybrid. Red Hat gives them (IBM) a technology base that they can apply their global reach to. IBM has the ability to bring open source Red Hat technology to every corner of the planet. 

How are open source architectures helping data scientists and CXOs with the much-needed edge adopting AI-ML (artificial intelligence and machine learning)?

AI is a really big space, and we have always sort of operated in how to get code built and (get it) into production faster. But now training models that can answer questions with precision are running in parallel. Our passion is to integrate that whole flow of models into production, right next to the apps that you’re already building today—we call this the ML ops (machine learning operations, which is jargon for a set of best practices for businesses to run AI successfully) space.

What that means is that we’re not trying to be the best in natural language processing (NLP) or building foundation AI models on it or convolutional neural networks (CNNs). We want to play in our sweet spot, which is how we arm data science teams to be able to get their models from development to production and time into those apps. This is the work we’ve done on OpenShift data science (managed cloud service for data scientists and developers) with it.

Another piece that’s changing and has been exciting for us, is hardware. As an example, cars today and going forward are moving to running just a computer in them. What we do really well is to put Linux on computers and the computer in your car, and the future will look very similar to the computer in your data centre today. And when we’re able to combine that platform, with bringing these AI models into that environment with the speed that you do with code with application integration, it opens up a lot of exciting opportunities for customers to get that data science model of building into the devices, or as close to customers as they possibly can.

This convergence is important, and it’s not tied to edge. Companies have realized that the closer they can push the interaction to the user, the better the experience it’s going to be.

And that could be in banking or pushing self-service to users' phones. In autonomous driving, it's going to be pushing the processing down to your rear view mirror to make decisions for you. In mining, it might be 5G. At the core of it is how far you can push your differentiated logic closer to your consumer use case. That's why I think we see the explosion in edge.

As a thought leader, I would like your views on trends like the decentralized web and open source metaverse.

If you look at the Red Hat structure, we have areas where we’re committed to businesses through our business units. But then we also have our office of technology that’s led by our CTO, Chris Wright, where we track industry trends where we haven’t necessarily taken a business stake or position but want to understand the technology behind it. The cryptographic blockchain decentralizing core technology foundations, which we watch very closely, is in this space right now. Because they do change the way you operate. It’s strikingly similar to how open source and coding practices are seen as normal today but when I started this 20 years ago, it was a much more connected and controlled experience versus a very decentralized one today. So, we track this very closely from a technology perspective (but) we haven’t yet taken a business position of this.

In this context, do you collaborate with IBM R&D too?

Yeah, we do. We worked closely with the IBM research team run by Dario Gil (senior VP and director of IBM Research) pre-acquisition, and we work even closer with them now. Post-acquisition, the focus on Red Hat and the clarity on IBM’s focus on open hybrid cloud have helped us collaborate even better.

Last, but not the least, what is Red Hat’s stance on the patent promise it made in September 2017, given that your company is now an IBM unit (which has over 70,000 active patents)?

We continue to collect our patents in a way that they won’t be leveraged against other users of open source. Red Hat will do it (patent) for the benefit of open source and to make the usage of open source a little safer. My patents, I believe, are included in that, and will continue to be included in that going forward.


Mon, 08 Aug 2022 17:31:00 -0500
IBM (IBM) Q2 Earnings: What To Expect

That answer will become more clear ... now accounts for more than 70% of total software revenue in the most recent quarter. The market is now giving IBM more credit for its cloud transition.

Sun, 17 Jul 2022 23:40:00 -0500

Big Blue Turns In A Solid Quarter For Systems

By all accounts, Big Blue had a pretty good quarter ending in June, with sales of its System z16 mainframes skyrocketing upwards as they do every couple of years at the beginning of a new cycle and sales of its high-end Power10 machines also getting some traction. If everything goes as planned, with the entry and midrange Power10 machines just launched and shipping at the end of this week, then the second half of 2022 should be a pretty good one for systems for IBM.

Nonetheless, Wall Street is giving IBM a bit of a drubbing as we go to press with this analysis of the company’s second quarter financial results, and that has more to do with the company shutting down its operations in Russia and the strength of the US dollar, which makes it more expensive to sell its products and services overseas. But a strong US dollar also makes every item sold overseas worth more dollars, which helps bolster revenues and profits. So there is that.

In the June quarter, IBM’s sales grew in a way that we have not seen in a long time, but also in a way we expected given the timing of the System z16 and Power10 product cycles. In a year and a half or so, unless IBM rolls out some upgraded processors – call them the z16+ and the Power10+, perhaps with refined 7 nanometer manufacturing processes from foundry partner Samsung as well as higher yields to boost the core count on the devices – then we expect a pretty big lull in system sales at IBM.

The other good news on the systems front is that Red Hat keeps plugging along, although with only 12 percent growth this quarter, to $1.47 billion, that Linux and Kubernetes business is considerably off the 20-ish percent growth Red Hat was sustaining when IBM agreed to acquire it back in October 2018 for $34 billion. In our models, we reckon that about 70 percent of the Red Hat revenue stream is for datacenter infrastructure software, such as Enterprise Linux, OpenShift, OpenStack, and a few things like Java middleware and software-defined storage. And based on that estimate, we also reckon that Red Hat broke through $1 billion in sales for subscriptions and services for such software, the second time it has done that in its history. (The first time was in Q4 2021, with $1.05 billion in sales and in this quarter it was $1.03 billion.)

IBM always talks about its groups and divisions in terms of constant currency, which means reckoning their value without taking into account the exchange rate elevation IBM gets when it converts overseas sales to US dollars. We like to think in terms of as-reported figures, just to keep it all consistent, and are happy to keep in mind the constant currency sales.

Sales in the Infrastructure group, which includes servers and storage and technical support for both, rose by 19 percent to $4.24 billion, and gross profit was only up 12.1 percent to $2.28 billion, with overall gross margins for IBM’s hardware business at 53.8 percent. Pre-tax income was $757 million, or 17.9 percent of revenues, which is the kind of profitability that a software development house can usually expect. (Somewhere between 15 percent and 20 percent is typical.) Within this Infrastructure group, sales of hybrid infrastructure – what you and I would call hardware or systems, and it is still not clear what the heck “hybrid” means when a big bank buys a mainframe to keep track of our money – rose by 34.3 percent (according to our model), hitting $2.77 billion. IBM does not provide specific revenue figures for its system sales, but said that at constant currency, the System z mainframe division saw sales rise by 77 percent, and that its Distributed Infrastructure division, which includes storage as well as Power Systems servers, had a more modest 17 percent uptick (again at constant currency). Storage was the big driver here, although Jim Kavanaugh, IBM’s chief financial officer, did say that IBM “had good performance in high-end Power10.” Infrastructure support revenues came in at $1.47 billion, down 2.1 percent.

IBM’s Hybrid Platforms & Solutions division within its Software group had $4.4 billion in sales, up a smidgen, and its Transaction Processing software division, which includes mainframe databases and middleware, had sales of $1.77 billion, up 11.6 percent year on year.

If you take chunks of its Software, Consulting, and Financing groups and allocate them to systems, as we have done for years, you can come up with a proxy for IBM’s overall “real” systems business, which we think is the thing that Wall Street and customers should be looking at and thinking about. We reckon that including that Red Hat systems business in the mix, IBM’s overall real systems business was just a tad under $8.2 billion, up 11.6 percent, and was $7.17 billion without those Red Hat subscriptions into the datacenter, up 11.5 percent.
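The mechanics of such a proxy are straightforward: take each group's reported revenue and weight it by the share you believe is systems-related. The allocation fractions and the software and consulting figures below are hypothetical placeholders for illustration, not the actual model behind the numbers above:

```python
# Sketch of a "real systems" revenue proxy. Only the infrastructure revenue
# is a figure reported in the article; the software and consulting revenues
# and all allocation fractions are invented placeholders to show the method.

def real_systems_revenue(groups, allocations):
    """Sum each group's revenue weighted by its assumed systems share."""
    return sum(rev * allocations[name] for name, rev in groups.items())

q2_2022 = {                  # revenue in $B (software/consulting are placeholders)
    "infrastructure": 4.24,
    "software": 6.17,
    "consulting": 4.81,
}
assumed_share = {            # hypothetical systems-related fraction per group
    "infrastructure": 1.00,
    "software": 0.35,
    "consulting": 0.30,
}
total = real_systems_revenue(q2_2022, assumed_share)
print(f"proxy 'real systems' revenue: ${total:.2f}B")
```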

As we projected many years ago, the Red Hat business has IBM’s overall systems business back on the track of growth, but remember that sales of IBM’s Power Systems and System z servers are very choppy, not the relatively smooth line that Red Hat is able to pull off each quarter – well, most quarters, anyway. The question we have is whether Red Hat can resume its historical growth rate of 20 percent, give or take.

The answer is probably no, but it is possible if not probable. We shall see.

IBM did not say much about Power10 in the call, except what we noted above and that the entry and midrange machines launched on July 12 would start shipping this week.

Add it all up, and IBM had $15.54 billion in sales, up 9.3 percent, with gross profit of $8.29 billion, up 5.6 percent and representing 53.4 percent of revenues. Pre-tax income was up by 1.9X to $1.72 billion and net income was up by 1.72X to $1.39 billion.

Tue, 19 Jul 2022 08:43:00 -0500 Timothy Prickett Morgan en-US text/html
How 'living architecture' could help the world avoid a soul-deadening digital future

(The Conversation is an independent and nonprofit source of news, analysis and commentary from academic experts.)

My first Apple laptop felt like a piece of magic made just for me – almost a part of myself. The rounded corners, the lively shading, the delightful animations. I had been using Windows my whole life, starting on my family’s IBM 386, and I never thought using a computer could be so fun.

Indeed, Apple co-founder Steve Jobs said that computers were like bicycles for the mind, extending your possibilities and helping you do things not only more efficiently but also more beautifully. Some technologies seem to unlock your humanity and make you feel inspired and alive.

But not all technologies are like this. Sometimes devices do not work reliably or as expected. Often you have to change to conform to the limitations of a system, as when you need to speak differently so a digital voice assistant can understand you. And some platforms bring out the worst in people. Think of anonymous flame wars.

As a researcher who studies technology, design and ethics, I believe that a hopeful way forward comes from the world of architecture. It all started decades ago with an architect’s observation that newer buildings tended to be lifeless and depressing, even if they were made using ever fancier tools and techniques.

Tech’s wear on humanity

The problems with technology are myriad and diffuse, and widely studied and reported: from short attention spans and tech neck to clickbait and AI bias to trolling and shaming to conspiracy theories and misinformation.

As people increasingly live online, these issues may only get worse. Some recent visions of the metaverse, for example, suggest that humans will come to live primarily in virtual spaces. Already, people worldwide spend on average seven hours per day on digital screens – nearly half of waking hours.

While public awareness of these issues is on the rise, it’s not clear whether or how tech companies will be able to address them. Is there a way to ensure that future technologies are more like my first Apple laptop and less like a Twitter pile-on?

Over the past 60 years, the architectural theorist Christopher Alexander pursued questions similar to these in his own field. Alexander, who died in March 2022 at age 85, developed a theory of design that has made inroads in architecture. Translated to the technology field, this theory can provide the principles and process for creating technologies that unlock people’s humanity rather than suppress it.

How good design is defined

Technology design is beginning to mature. Tech companies and product managers have realized that a well-designed user interface is essential for a product’s success, not just nice to have.

As professions mature, they tend to organize their knowledge into concepts. Design patterns are a great example of this. A design pattern is a reusable solution to a problem that designers need to solve frequently.

In user experience design, for instance, such problems include helping users enter their shipping information or get back to the home page. Instead of reinventing the wheel every time, designers can apply a design pattern: clicking the logo at the upper left always takes you home. With design patterns, life is easier for designers, and the end products are better for users.
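The same idea shows up in software as reusable code structures. A textbook sketch of one such pattern, Observer, offered purely as a generic illustration of "a reusable solution to a problem that designers need to solve frequently" and not tied to any product mentioned here:

```python
# The Observer pattern: a subject holds state and notifies registered
# observers whenever that state changes, so designers do not reinvent
# change-propagation plumbing for every new feature.

class Subject:
    """Holds state and notifies registered observers when it changes."""

    def __init__(self):
        self._observers = []
        self._state = None

    def attach(self, callback):
        """Register any callable to be told about state changes."""
        self._observers.append(callback)

    def set_state(self, value):
        self._state = value
        for notify in self._observers:
            notify(value)

seen = []
subject = Subject()
subject.attach(seen.append)   # any callable can observe
subject.set_state("home")     # every observer hears about the change
print(seen)                   # -> ['home']
```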

Design patterns facilitate good design in one sense: They are efficient and productive. Yet they do not necessarily lead to designs that are good for people. They can be sterile and generic. How, exactly, to avoid that is a major challenge.

A seed of hope lies in the very place where design patterns originated: the work of Christopher Alexander. Alexander dedicated his life to understanding what makes an environment good for humans – good in a deep, moral sense – and how designers might create structures that are likewise good.

His work on design patterns, dating back to the 1960s, was his initial effort at an answer. The patterns he developed with his colleagues included details like how many stories a good building should have and how many light sources a good room should have.

But Alexander found design patterns ultimately unsatisfying. He took that work further, eventually publishing his theory in his four-volume magnum opus, “The Nature of Order.”

While Alexander’s work on design patterns is very well known – his 1977 book “A Pattern Language” remains a bestseller – his later work, which he deemed much more important, has been largely overlooked. No surprise, then, that his deepest insights have not yet entered technology design. But if they do, good design could come to mean something much richer.

On creating structures that foster life

Architecture was getting worse, not better. That was Christopher Alexander’s conclusion in the mid-20th century.

Much modern architecture is inert and makes people feel dead inside. It may be sleek and intellectual – it may even win awards – but it does not help generate a feeling of life within its occupants. What went wrong, and how might architecture correct its course?

Motivated by this question, Alexander conducted numerous experiments throughout his career, going deeper and deeper. Beginning with his design patterns, he discovered that the designs that stirred up the most feeling in people, what he called living structure, shared certain qualities. This wasn’t just a hunch, but a testable empirical theory, one that he validated and refined from the late 1970s until the turn of the century. He identified 15 qualities, each with a technical definition and many examples.

The qualities are:

- Levels of scale

- Strong centers

- Boundaries

- Alternating repetition

- Positive space

- Good shape

- Local symmetries

- Deep interlock and ambiguity

- Contrast

- Gradients

- Roughness

- Echoes

- The void

- Simplicity and inner calm

- Not-separateness

As Alexander writes, living structure is not just pleasant and energizing, though it is also those. Living structure reaches into humans at a transcendent level – connecting people with themselves and with one another – with all humans across centuries and cultures and climates.

Yet modern architecture, as Alexander showed, has very few of the qualities that make living structure. In other words, over the 20th century architects taught one another to do it all wrong. Worse, these errors were crystallized in building codes, zoning laws, awards criteria and education. He decided it was time to turn things around.

Alexander’s ideas have been hugely influential in architectural theory and criticism. But the world has not yet seen the paradigm shift he was hoping for.

By the mid-1990s, Alexander recognized that for his aims to be achieved, there would need to be many more people on board – and not just architects, but all sorts of planners, infrastructure developers and everyday people. And perhaps other fields besides architecture. The digital revolution was coming to a head.

Alexander’s invitation to technology designers

As Alexander doggedly pursued his research, he started to notice the potential for digital technology to be a force for good. More and more, digital technology was becoming part of the human environment – becoming, that is, architectural.

Meanwhile, Alexander’s ideas about design patterns had entered the world of technology design as a way to organize and communicate design knowledge. To be sure, this older work of Alexander’s proved very valuable, particularly to software engineering.
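As an illustrative aside (not from the original article), a software design pattern pairs a named, recurring problem with a reusable solution, much as Alexander's architectural patterns did. A minimal sketch of the classic Observer pattern shows the form:

```python
# Observer: a classic pattern from the Alexander-inspired software
# design-pattern catalog.
# Problem: several parts of a program must react when some state changes.
# Solution: the subject keeps a list of observers and notifies each one
# whenever a change occurs.

class Subject:
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        """Register a callable to be invoked on every change."""
        self._observers.append(observer)

    def notify(self, event):
        """Tell every registered observer about an event."""
        for observer in self._observers:
            observer(event)

subject = Subject()
received = []
subject.attach(received.append)  # any callable works as an observer
subject.notify("state changed")
print(received)  # -> ['state changed']
```

The point of the pattern format is communication: the name, problem, and solution travel together, so designers can reuse the knowledge without reinventing it.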

Because of his fame for design patterns, in 1996 Alexander was invited to give a keynote address at a major software engineering conference sponsored by the Association for Computing Machinery.

In his talk, Alexander remarked that the tech industry was making great strides in efficiency and power but perhaps had not paused to ask: “What are we supposed to be doing with all these programs? How are they supposed to help the Earth?”

“For now, you’re like guns for hire,” Alexander said. He invited the audience to make technologies for good, not just for pay.

Loosening the design process

In “The Nature of Order,” Alexander defined not only his theory of living structure, but also a process for creating such structure.

In short, this process involves democratic participation and springs from the bottom up in an evolving progression incorporating the 15 qualities of living structure. The end result isn’t known ahead of time – it’s adapted along the way. The term “organic” comes to mind, and this is appropriate, because nature almost invariably creates living structure.

But typical architecture – and design in many fields – is, in contrast, top-down and strictly defined from the outset. In this machinelike process, rigid precision is prioritized over local adaptability, project roles are siloed apart and the emphasis is on commercial value and investment over anything else. This is a recipe for lifeless structure.

Alexander’s work suggests that if living structure is the goal, the design process is the place to focus. And the technology field is starting to show inklings of change.

In project management, for example, the traditional waterfall approach followed a rigid, step-by-step schedule defined upfront. The turn of the century saw the emergence of a more dynamic approach, dubbed agile, which allows for more adaptability through frequent check-ins and prioritization, progressing in “sprints” of one to two weeks rather than longer phases.

And in design, the human-centered design paradigm is likewise gaining steam. Human-centered design emphasizes, among other elements, continually testing and refining small changes with respect to design goals.

A design process that promotes life

However, Alexander would say that both these trajectories are missing some of his deeper insights about living structure. They may spark more purchases and increase stock prices, but these approaches will not necessarily create technologies that are good for each person and good for the world.

Yet there are some emerging efforts toward this deeper end. For example, design pioneer Don Norman, who coined the term “user experience,” has been developing his ideas on what he calls humanity-centered design. This goes beyond human-centered design to focus on ecosystems, take a long-term view, incorporate human values and involve stakeholder communities along the way.

The vision of humanity-centered design calls for sweeping changes in the technology field. This is precisely the kind of reorientation that Alexander was calling for in his 1996 keynote speech. Just as design patterns suggested in the first place, the technology field doesn’t need to reinvent the wheel. Technologists and people of all stripes can build up from the tremendous, careful work that Alexander has left.

This article is republished from The Conversation under a Creative Commons license. Read the original article here:

Tue, 09 Aug 2022 07:04:00 -0500
Rochester's Big Blue workers to split between IBM, Kyndryl

IBM’s coming split into two separate companies by the end of 2021 will mean Rochester employees will soon be working for two different employers.

In October 2020, IBM announced that it would be splitting into two entities by the end of 2021. The second company, to be based in New York City, will go by the name of Kyndryl.

While many questions remain, some answers have come out about how this will impact what remains of IBM’s long-time presence in Rochester.

IBM in Rochester is a sliver of what it was from the late 1950s to the 1980s, although it is believed to remain a major employer in the city. The workforce has been eroded from more than 8,000 employees to an estimated 2,300 to 2,800 workers today. The company stopped releasing an exact local workforce number in 2008, when 4,200 employees were based here.

Of those remaining Big Blue teams, some will work for the classic IBM and some will be employed by Kyndryl, according to IBM's U.S. Markets and Regional Communications Manager Carrie Bendzsa.

The employees that will remain on the IBM roster include those who work on IBM i, Power, and Quantum computing as well as cloud and cognitive software and technology support services. An IBM Finance & Operations team and the IBM U.S. Patent center also will continue to be based at IBM in Rochester.

Staffers who work with the Managed Infrastructure Services team in Rochester will become Kyndryl employees, when it becomes an independent company.

“At Kyndryl, we create, modernize, run and manage the technology infrastructure the world depends on every day. That work continues at the Rochester, MN site,” stated Douglas Shelton, the former director of IBM Corporate Communications who now works for Kyndryl.

The Kyndryl workers will remain in Rochester in leased space on the former IBM campus.

In 2018, IBM sold its almost 500-acre Rochester campus with its 34 buildings for $33.9 million. IBM has a 12-year lease to occupy eight buildings on the east side of its former complex at 2900 37th St NW that is now called the Rochester Technology Campus.


Fri, 29 Jul 2022 23:00:00 -0500
IBM Touts AI, Hybrid Cloud: ‘Demand For Our Solutions Remains Strong’

Cloud News

Joseph F. Kovar

‘Given its ability to boost innovation, productivity, resilience, and help organizations scale, IT has become a high priority in a company’s budget. As such, there is every reason to believe technology spending in the B2B space will continue to surpass GDP growth,’ says IBM CEO Arvind Krishna.


A strengthening IT environment that is playing into IBM AI and hybrid cloud capabilities means a rosy future for IBM and its B2B business, CEO Arvind Krishna told investors Monday.

Krishna, in his prepared remarks for IBM’s second fiscal quarter 2022 financial analyst conference call, said that technology serves as a fundamental source of competitive advantage for businesses.

“It serves as both a deflationary force and a force multiplier, and is especially critical as clients face challenges on multiple fronts from supply chain bottlenecks to demographic shifts,” he said. “Given its ability to boost innovation, productivity, resilience, and help organizations scale, IT has become a high priority in a company’s budget. As such, there is every reason to believe technology spending in the B2B space will continue to surpass GDP growth.”


That plays well with IBM’s hybrid cloud and AI strategy where the company is investing in its offerings, technical talent, ecosystem, and go-to-market model, Krishna said.

“Demand for our solutions remains strong,” he said. “We continued to have double-digit performance in IBM Consulting, broad-based strength in software, and with the z16 [mainframe] platform launch, our infrastructure business had a good quarter. By integrating technology and expertise from IBM and our partners, our clients will continue to see our hybrid cloud and AI solutions as a crucial source of business opportunity and growth.”

Krishna said hybrid clouds are about offering clients a platform to straddle multiple public clouds, private clouds, on-premises infrastructures, and the edge, which is where Red Hat, which IBM acquired in 2019, comes into play, Krishna said.

“Our software has been optimized to run on that platform, and includes advanced data and AI, automation, and the security capabilities our clients need,” he said. “Our global team of consultants offers deep business expertise and co-creates with clients to accelerate their digital transformation journeys. Our infrastructure allows clients to take full advantage of an extended hybrid cloud environment.”

As a result, IBM now has over 4,000 hybrid cloud platform clients, with over 250 new clients added during the second fiscal quarter, Krishna said.

“Those who adopt our platform tend to consume more of our solutions across software, consulting, and infrastructure, [and] expanding our footprint within those clients,” he said.

IBM is also benefitting from the steady adoption by businesses of artificial intelligence technologies as those businesses try to process the enormous amount of data generated from hybrid cloud environments all the way to the edge, Krishna said. An IBM study released during the second fiscal quarter found that 35 percent of companies are now using some form of AI with automation in their business to address demographic shifts and move their employees to higher value work, he said.

“This is one of the many reasons we are investing heavily in both AI and automation,” he said. “These investments are paying off.”

IBM is also moving to develop leadership in quantum computing, Krishna said. The company currently has a 127-qubit quantum computer in its cloud, and is committed to demonstrating the first 400-plus-qubit system before year-end as part of its path to deliver a 1,000-plus-qubit system next year and a 4,000-plus-qubit system in 2025, he said.

“One of the implications of quantum computing will be the need to change how information is encrypted,” he said. “We are proud that technology developed by IBM and our collaborators has been selected by NIST (National Institute of Standards and Technology) as the basis of the next generation of quantum-safe encryption protocols.”

IBM during the quarter also moved forward in its mainframe technology with the release of its new z16 mainframe, Krishna said.

“The z16 is designed for cloud-native development, cybersecurity resilience, [and] quantum-safe encryption, and includes an on-chip AI accelerator, which allows clients to reduce fraud within real-time transactions,” he said.

IBM also made two acquisitions during the quarter related to cybersecurity, Krishna said. The first was Randori, an attack surface management and offensive cybersecurity provider. That acquisition built on IBM’s November acquisition of ReaQta, an endpoint security firm, he said.

While analysts during the question and answer part of Monday’s financial analyst conference call did not ask about the news that IBM has brought in Matt Hicks as the new CEO of Red Hat, they did seem concerned about how the 17-percent growth in Red Hat revenue over last year missed expectations.

When asked about Red Hat revenue, Krishna said IBM feels very good about the Red Hat business and expects continued strong demand.

“That said, we had said late last year that we expect growth in Red Hat to be in the upper teens,” he said. “That expectation is what we are going to continue with. … Deferred revenue accounts for the bulk of what has been the difference in the growth rates coming down from last year to this year.”

IBM CFO James Kavanaugh followed by saying that while IBM saw 17 percent growth overall for Red Hat, the company took market share with its core RHEL (Red Hat Enterprise Linux) and in its Red Hat OpenShift hybrid cloud platform foundation. Red Hat OpenShift revenue is now four-and-a-half times the revenue before IBM acquired Red Hat, and Red Hat OpenShift bookings were up over 50 percent, Kavanaugh said.

“So we feel pretty good about our Red Hat portfolio overall. … Remember, we‘re three years into this acquisition right now,” he said. “And we couldn’t be more pleased as we move forward.”

When asked about the potential impact from an economic downturn, Krishna said IBM’s pipelines remain healthy and consistent with what the company saw in the first half of fiscal 2022, making him more optimistic than many of his peers.

“In an inflationary environment, when clients take our technology, deploy it, leverage our consulting, it acts as a counterbalance to all of the inflation and all of the labor demographics that people are facing all over the globe,” he said.

Krishna also said IBM’s consulting business is less likely than most vendors’ business to be impacted by the economic cycle, as it involves a lot of work deploying the kinds of applications critical to clients’ need to optimize their costs. Furthermore, he said, because consulting is very labor-intensive, it is easy to hire or let go tens of thousands of employees as needed.

The Numbers

For its second fiscal quarter 2022, which ended June 30, IBM reported total revenue of $15.5 billion, up about 9 percent from the $14.2 billion the company reported for its second fiscal quarter 2021.
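The quoted growth rate follows from simple year-over-year arithmetic; as a quick check, using only the figures stated above:

```python
# Year-over-year growth from the reported quarterly revenue, in percent.
def yoy_growth_pct(current, prior):
    return (current - prior) / prior * 100

q2_2022, q2_2021 = 15.5, 14.2  # total revenue, $ billions
print(round(yoy_growth_pct(q2_2022, q2_2021)))  # -> 9
```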

This includes software revenue of $6.2 billion, up from $5.9 billion; consulting revenue of $4.8 billion, up from $4.4 billion; infrastructure revenue of $4.2 billion, up from $3.6 billion; financing revenue of $146 million, down from $209 million; and other revenue of $180 million, down from $277 million.

On the software side, IBM reported annual recurring revenue of $12.9 billion, which was up 8 percent over last year. Software revenue from its Red Hat business was up 17 percent over last year, while automation software was up 8 percent, data and AI software up 4 percent, and security software up 5 percent.

On the consulting side, technology consulting revenue was up 23 percent over last year, applications operations up 17 percent, and business transformation up 16 percent.

Infrastructure revenue growth was driven by hybrid infrastructure sales, which rose 7 percent over last year, and infrastructure support, which grew 5 percent. Hybrid infrastructure revenue saw a significant boost from zSystems mainframe sales, which rose 77 percent over last year.

IBM also reported revenue of $8.1 billion from sales to the Americas, up 15 percent over last year; sales to Europe, Middle East, and Africa of $4.5 billion, up 17 percent; and $2.9 billion to the Asia Pacific area, up 16 percent.

Sales to Kyndryl, which late last year was spun out of IBM, accounted for about 5 percent of revenue, including 3 percent of IBM’s Americas revenue.

IBM also reported net income for the quarter on a GAAP basis of $1.39 billion, or $1.53 per share, up from last year’s $1.33 billion, or $1.47 per share.

Joseph F. Kovar

Joseph F. Kovar is a senior editor and reporter for the storage and the non-tech-focused channel beats for CRN. He keeps readers abreast of the latest issues related to such areas as data life-cycle, business continuity and disaster recovery, and data centers, along with related services and software, while highlighting some of the key trends that impact the IT channel overall. He can be reached at

Tue, 19 Jul 2022 13:17:00 -0500
Edge AI Software Market Status and Segments Forecast 2022-2027 | IBM, Imagimob, Microsoft, Nutanix,

Astute Analytica released a new research report on the global Edge AI Software Market. The report is a thorough investigation of current Edge AI Software Market trends, covering market definitions, market segmentation, end-use applications, and industry chain analysis. It also offers a succinct overview of the market, encompassing the competitive environment, recent market developments, and industry trends.

The global Edge AI Software market held a market value of USD 1,300.0 Million in 2020 and is forecasted to reach USD 8,049.8 Million by the year 2027. The market is expected to register a CAGR of 29.8% during the forecast period.
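Those figures are consistent with the standard compound-annual-growth-rate formula; a quick sketch of the check:

```python
# CAGR check: grow USD 1,300.0M (2020) to USD 8,049.8M (2027).
# CAGR = (end / start) ** (1 / years) - 1
start, end, years = 1300.0, 8049.8, 2027 - 2020
cagr_pct = ((end / start) ** (1 / years) - 1) * 100
print(round(cagr_pct, 1))  # -> 29.8
```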

The competition study provides information about the major players in the market in terms of their financials, company profiles, product portfolios, and capacity. Along with key development trends and sales channel research, the report offers upstream raw material analysis and downstream demand analysis, and covers investment opportunity areas.


Despite these driving factors, security and privacy concerns, coupled with vulnerability to cyber attacks, are expected to hinder market growth. During the COVID-19 pandemic, the healthcare segment using edge AI software saw positive growth, as the software attracted growing funding and research aimed at keeping businesses safe and secure across the value chain.

Growth Influencers:

Advancements in AI powered IoT (Internet of Things) for intelligent systems and smart applications

The artificial intelligence sector is seeing a range of applications emerge across verticals. These applications need massive computing power to capture and process data in real time. When running on cloud infrastructure, AI applications suffer latency issues that make quick responses difficult. Edge AI software keeps resources at the edge of the network, which lets applications work with high bandwidth and low latency. Hence, advancements in AI-powered IoT (Internet of Things) for intelligent systems and smart applications are anticipated to boost market growth.
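To make the latency argument concrete, here is a hypothetical round-trip budget (the numbers are illustrative assumptions, not figures from the report): running inference locally removes the network hop entirely.

```python
# Hypothetical latency budget for one inference request (milliseconds).
# All numbers are illustrative assumptions, not figures from the report.
network_rtt_ms = 80   # round trip to a distant cloud region
inference_ms = 15     # model execution time, same hardware class assumed

cloud_total = network_rtt_ms + inference_ms  # request must cross the network
edge_total = inference_ms                    # model runs on the local device

print(cloud_total, edge_total)  # -> 95 15
```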

Segments Overview:

The global Edge AI Software market is segmented into component, data source, application, and end-users.

By Component

  • Solutions
  • Software Tools
  • Platform
  • Services
  • Training and Consulting Services
  • System Integration and Testing
  • Support and Maintenance

The solutions segment is estimated to account for the largest market share, around 80%, owing to high demand. The services segment is expected to grow at the fastest CAGR, 30.6%, owing to slowly increasing adoption of training and consulting services. The support and maintenance segment held a market size of USD 337.3 million in 2020.

By Data Source

  • Biometric Data
  • Mobile Data
  • Sensor Data
  • Speech Recognition
  • Video and Image Recognition

The sensor data segment is anticipated to hold the largest market share, around 26%, owing to high product availability in this segment. The biometric data segment is estimated to grow at the fastest rate, about 30.4%, owing to growing adoption of biometric technology across industries.


By Application

  • Access Control
  • Autonomous Vehicles
  • Energy Management
  • Predictive Maintenance
  • Remote Monitoring
  • Telemetry
  • Video Surveillance
  • Others

The energy management segment is expected to account for the dominant share of the market owing to the growing number of edge AI software applications in this industry. The video surveillance segment is estimated to reach a market value of around USD 500 million by 2025.

By End-Users

  • Advanced Industries
  • Banking and Insurance
  • Chemicals and Agriculture
  • Consumer
  • Cross-Vertical
  • Energy and Materials
  • Healthcare
  • Infrastructure
  • Media and Entertainment
  • Public Sector and Utilities
  • Retail
  • Travel, Transport and Logistics

The travel, transport, and logistics segment accounted for the largest market share, about 21%, owing to growing adoption of the technology in the sector. The cross-vertical sector is estimated to witness a growth rate of around 30.3%, and the consumer segment is expected to surpass a market value of around USD 262.4 million by 2025.

Regional Overview:

On a regional basis, the global Edge AI Software market is segmented into Europe, North America, Asia Pacific, Middle East & Africa, and South America.

The Asia Pacific region held the largest market share, about 38%, owing to increasing adoption of technologically advanced solutions and a growing travel industry in countries such as India and Japan. The North American region saw a growth rate of around 30.9%, driven by the growing healthcare IT industry in the U.S. and Canada.

Competitive Landscape:

Key players operating in the global Edge AI Software market include Alef Edge, Inc., Anagog Ltd., AWS, Azion Technologies, Bragi.Com, Chaos Prime, Inc., Clearblade, Inc., Foghorn Systems, Inc., Google, Gorilla Technology Group, Inc., IBM, Imagimob, Microsoft, Nutanix, Octonion, Sixsq Sarl, Synaptics, TACT.AI, TIBCO Software, Veea Inc., and other prominent players.

The two major players in the market hold about 25-30% of the market share. These players are engaged in product launches, mergers & acquisitions, and collaborations, among other activities. For instance, in January 2021, TIBCO Software, Inc. acquired Information Builders, advancing its connected intelligence platform by adding Information Builders’ data management and analytics capabilities.

The global Edge AI Software market report provides insights on the below pointers:

  • Market Penetration: Provides comprehensive information on the market offered by the prominent players
  • Market Development: The report offers detailed information about lucrative emerging markets and analyzes penetration across mature segments of the markets
  • Market Diversification: Provides in-depth information about untapped geographies, recent developments, and investments
  • Competitive Landscape Assessment: Mergers & acquisitions, certifications, product launches in the global Edge AI Software market have been provided in this research report. In addition, the report also emphasizes the SWOT analysis of the leading players.
  • Product Development & Innovation: The report provides intelligent insights on future technologies, R&D activities, and breakthrough product developments

The global Edge AI Software market report answers questions such as:

  • What is the market size and forecast of the Global Edge AI Software Market?
  • What are the inhibiting factors and impact of COVID-19 on the Global Edge AI Software Market during the assessment period?
  • Which are the products/segments/applications/areas to invest in over the assessment period in the Global Edge AI Software Market?
  • What is the competitive strategic window for opportunities in the Global Edge AI Software Market?
  • What are the technology trends and regulatory frameworks in the Global Edge AI Software Market?
  • What is the market share of the leading players in the Global Edge AI Software Market?
  • What modes and strategic moves are considered favorable for entering the Global Edge AI Software Market?


About Astute Analytica:

Astute Analytica is a global analytics and advisory company that has built a solid reputation in a short period, thanks to the tangible outcomes we have delivered to our clients. We pride ourselves in generating unparalleled, in-depth, and uncannily accurate estimates and projections for our very demanding clients spread across different verticals. We have a long list of satisfied and repeat clients from a wide spectrum including technology, healthcare, chemicals, semiconductors, FMCG, and many more. These happy customers come to us from all across the globe.

They are able to make well-calibrated decisions and leverage highly lucrative opportunities while surmounting the fierce challenges all because we analyze for them the complex business environment, segment-wise existing and emerging possibilities, technology formations, growth estimates, and even the strategic choices available. In short, a complete package. All this is possible because we have a highly qualified, competent, and experienced team of professionals comprising business analysts, economists, consultants, and technology experts. In our list of priorities, you, our patron, come at the top. You can be sure of the best cost-effective, value-added package from us, should you decide to engage with us.

Get in touch with us:

Phone number: +18884296757

Email[email protected]

Visit our website:

Thu, 04 Aug 2022 23:25:00 -0500 Newsmantraa
Red Hat's next steps, according to its new CEO and chairman

In its latest quarter, IBM saw its hybrid-cloud revenue jump 18% to $5.9 billion. Along with this, IBM saw its highest sales growth in a decade. Much of that is due to its stand-alone Red Hat division. True, Red Hat sales increased by "only" 12%, which is low by Red Hat standards but darn good by any other standard. So what will Red Hat do now that it has a new CEO, Matt Hicks, and chairman, Paul Cormier? 

The answer: Stay the course.

In an interview, Hicks, who's been with Red Hat since 2006, said, "[We'll keep using] the same core fundamentals that we built 20-plus years ago." Why? Because the combination of Linux, open-source software, and top support, "continues to play in new markets, whether that's the shift to cloud and cloud services or to edge computing. In the next couple of quarters. we'll just focus on executing. There's great momentum right now around the open hybrid cloud." 

It's not just the cloud, though. Hicks continued, "We have a lot of opportunities. We're also working with General Motors on Ultifi, GM's end-to-end software platform, and two days ago, we announced a partnership with ABB, one of the world's leading manufacturing automation companies. It's pretty cool to see Linux and open source technologies being pulled into these totally new markets in the industry. So my job is not to change anything but keep us executing and capturing the opportunities ahead."

As for Cormier, who's moving from the CEO office to the board suite, he plans to be the first chairman in Red Hat's history who has an office here. He'll continue to push forward on Red Hat's hybrid computing initiatives. After all, Cormier observed, "I dare say we were the first software company to really start beating the drum for hybrid cloud." 

Looking ahead, Cormier will "work with a lot of our customers and partners that I've always worked with, but more intensely on helping them into the hybrid architecture." Sure, "I'm going to run our strategic advisory board and work with the management team in an advisory role, but I'll be very customer focused as well."

That's a much more active role than most chairmen, but Hicks is fine with it. "I've gotten to work with Paul for a decade," said Hicks. "Paul's given us an incredible structure and foundation with IBM and how we interact with IBM that I think will really be sustainable."

As for working with its distant parent company IBM, things will also stay the same. Cormier said, "The red lines were red, and the blue lines were blue, and that will stay the same." Hicks agreed. "It's critical, not just for Red Hat but for IBM as well that we continue to be market neutral."

Moving to the technical side, I asked about Red Hat and CentOS. Hicks replied, "I think it was a necessary shift and change. I'm a big believer in what makes open source work is the contribution cycle, and that wasn't happening with CentOS." 

Cormier added that Linux's biggest contribution to changing the world at first was accessibility. Now, however, he said, "This might be controversial, but I think what may be even bigger now is the innovation that drives it, and that needs contributions. Without it driving open source and Linux, the cloud wouldn't be here."

Red Hat also will continue to strive to lead the way in Linux and open-source security. Hicks said, "We'll continue to invest a lot in security. That was the foundation that Red Hat was built on. That you can get open-source innovation and deploy it with trust. Nothing has changed with that other than we certainly secure a lot more software today."

Due to SolarWinds and other software supply attacks, continued Hicks, "There's a better awareness that it must be addressed whether it's by a Secure Bill of Materials or various open-source security standards. We will continue to invest a lot in it. We haven't made specific product decisions other than knowing that this is a critical area for us to keep driving trust with customers."

Moving on to edge computing, both Hicks and Cormier feel that we've moved from a time when there was an obsessive focus on the edge itself to a more sensible and practical focus on what it can do for work. Hicks said, "It's our hope that we can bring platform continuity from the data center to the cloud to the edge devices without being in this embedded edge play. The economics of that aren't super interesting for us, but the connected economics, that's where we think there's more innovation and opportunity." 

Added Cormier, "In the old days of Linux, the financial analysts would sit in the back of the room with their calculators and work on how many servers times X dollars per server? It's not about that for Red Hat."

Instead, continued Hicks, as the two played off each other, "We expect to see an 800% increase in edge applications built by 2024. We want those applications to be part of the open hybrid cloud. We think we have a unique position to connect end devices back to the assets that you have in your data centers and cloud that you use to run your company today."

Put it together, and both leaders see good times ahead for Red Hat and its partners and customers. I think they're right.


Tue, 19 Jul 2022 22:18:00 -0500