Can IBM Get Back Into HPC With Power10?

The “Cirrus” Power10 processor from IBM, which we codenamed for Big Blue because it refused to do it publicly and because we understand the value of a synonym here at The Next Platform, shipped last September in the “Denali” Power E1080 big iron NUMA machine. And today, the rest of the Power10-based Power Systems product line is being fleshed out with the launch of entry and midrange machines – many of which are suitable for supporting HPC and AI workloads as well as in-memory databases and other workloads in large enterprises.

The question is, will IBM care about traditional HPC simulation and modeling ever again with the same vigor that it has in past decades? And can Power10 help reinvigorate the HPC and AI business at IBM? We are not sure about the answer to the first question, and got the distinct impression from Ken King, the general manager of the Power Systems business, that HPC proper was not a high priority when we spoke to him back in February about this. But we continue to believe that the Power10 platform has some attributes that make it appealing for data analytics and other workloads that need to be either scaled out across small machines or scaled up across big ones.

Today, we are just going to talk about the five entry Power10 machines, which have one or two processor sockets in a standard 2U or 4U form factor, and then we will follow up with an analysis of the Power E1050, which is a four socket machine that fits into a 4U form factor. And the question we wanted to answer was simple: Can a Power10 processor hold its own against X86 server chips from Intel and AMD when it comes to basic CPU-only floating point computing?

This is an important question because there are plenty of workloads that have not been accelerated by GPUs in the HPC arena, and for these workloads, the Power10 architecture could prove to be very interesting if IBM thought outside of the box a little. This is particularly true when considering the feature called memory inception, which is in effect the ability to build a memory area network across clusters of machines and which we have discussed a little in the past.

We went deep into the architecture of the Power10 chip two years ago when it was presented at the Hot Chips conference, and we are not going to go over that ground again here. Suffice it to say that this chip can hold its own against Intel’s current “Ice Lake” Xeon SPs, launched in April 2021, and AMD’s current “Milan” Epyc 7003s, launched in March 2021. And this makes sense because the original plan was to have a Power10 chip in the field with 24 fat cores and 48 skinny ones in dual-chip modules, built on 10 nanometer processes from IBM’s former foundry partner, Globalfoundries, sometime in 2021, three years after the Power9 chip launched in 2018. Globalfoundries did not get the 10 nanometer processes working, botched a jump to 7 nanometers and spiked it, and that left IBM turning to Samsung as its server chip foundry partner, using Samsung’s 7 nanometer process. IBM took the opportunity of the Power10 delay to reimplement the Power ISA in a new Power10 core and then added matrix math overlays to its vector units to make it a good AI inference engine.

IBM also created a beefier core and dropped the core count back to 16 on a die in SMT8 mode, which is an implementation of simultaneous multithreading that has up to eight processing threads per core, and also was thinking about an SMT4 design which would double the core count to 32 per chip. But we have not seen that today, and with IBM not chasing Google and other hyperscalers with Power10, we may never see it. But it was in the roadmaps way back when.

What IBM has done in the entry machines is put two Power10 chips inside of a single socket to increase the core count, but it is looking like the yields on the chips are not as high as IBM might have wanted. When IBM first started talking about the Power10 chip, it said it would have 15 or 30 cores, which was a strange number, and that is because it kept one SMT8 core or two SMT4 cores in reserve as a hedge against bad yields. In the products that IBM is rolling out today, mostly for its existing AIX Unix and IBM i (formerly OS/400) enterprise accounts, the core counts on the dies are much lower, with 4, 8, 10, or 12 of the 16 cores active. The Power10 cores have roughly 70 percent more performance than the Power9 cores in these entry machines, and that is a lot of performance for many enterprise customers – enough to get through a few years of growth on their workloads. IBM is charging a bit more for the Power10 machines compared to the Power9 machines, according to Steve Sibley, vice president of Power product management at IBM, but the bang for the buck is definitely improving across the generations. At the very low end with the Power S1014 machine that is aimed at small and midrange businesses running ERP workloads on the IBM i software stack, that improvement is in the range of 40 percent, give or take, and the price increase is somewhere between 20 percent and 25 percent depending on the configuration.

Pricing is not yet available on any of these entry Power10 machines, which ship on July 22. When we find out more, we will do more analysis of the price/performance.

There are six new entry Power10 machines, the feeds and speeds of which are shown below:

For the HPC crowd, the Power L1022 and the Power L1024 are probably the most interesting ones because they are designed to only run Linux and, if they are like prior L-classified machines in the Power8 and Power9 families, will have lower pricing for CPU, memory, and storage, allowing them to better compete against X86 systems running Linux in cluster environments. This will be particularly important as IBM pushes Red Hat OpenShift as a container platform not only for enterprise workloads but also for HPC and data analytics workloads that are also being containerized these days.

One thing to note about these machines: IBM is using its OpenCAPI Memory Interface, which, as we explained in the past, uses the “Bluelink” I/O interconnect for NUMA links and accelerator attachment as a memory controller. IBM is now calling this the Open Memory Interface, and these systems have twice as many memory channels as a typical X86 server chip and therefore have a lot more aggregate bandwidth coming off the sockets. The OMI memory makes use of a Differential DIMM form factor that employs DDR4 memory running at 3.2 GHz, and it will be no big deal for IBM to swap DDR5 memory chips into its DDIMMs when they are out and the price is not crazy. IBM is offering memory features with 32 GB, 64 GB, and 128 GB capacities today in these machines and will offer 256 GB DDIMMs on November 14, which is how you get the maximum capacities shown in the table above. The important thing for HPC customers is that IBM is delivering 409 GB/sec of memory bandwidth per socket and 2 TB of memory per socket.
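That 409 GB/sec number is easy to sanity-check: peak bandwidth is just channels times transfer rate times bytes per transfer. The channel count and effective 8-byte data path in this sketch are our assumptions about how the OMI subsystem adds up, not IBM-published parameters:

```python
# Peak memory bandwidth = channels x transfers per second x bytes per transfer.
def peak_mem_bw_gbs(channels, transfer_rate_gts, bytes_per_transfer):
    return channels * transfer_rate_gts * bytes_per_transfer

# Assumed: 16 OMI channels, each presenting like a DDR4-3200 channel
# (3.2 GT/s) with an 8-byte-wide effective data path.
bw = peak_mem_bw_gbs(channels=16, transfer_rate_gts=3.2, bytes_per_transfer=8)
print(f"{bw:.1f} GB/s per socket")  # 409.6 GB/s per socket
```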

By the way, the only storage in these machines is NVM-Express flash drives. No disk, no plain vanilla flash SSDs. The machines also support a mix of PCI-Express 4.0 and PCI-Express 5.0 slots, and do not yet support the CXL protocol created by Intel and backed by IBM even though it loves its own Bluelink OpenCAPI interconnect for linking memory and accelerators to the Power compute engines.

Here are the different processor SKUs offered in the Power10 entry machines:

As far as we are concerned, the 24-core Power10 DCM feature EPGK processor in the Power L1024 is the only interesting one for HPC work, aside from what a theoretical 32-core Power10 DCM might be able to do. And just for fun, we sat down and figured out the peak theoretical 64-bit floating point performance, at all-core base and all-core turbo clock speeds, for these two Power10 chips and their rivals in the Intel and AMD CPU lineups. Take a gander at this:

We have no idea what the pricing will be for a processor module in these entry Power10 machines, so we took a stab at what the 24-core variant might cost to be competitive with the X86 alternatives based solely on FP64 throughput and then reckoned the performance of what a full-on 32-core Power10 DCM might be.
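The arithmetic behind such peak FP64 numbers is simple: cores times all-core clock times FP64 operations per core per cycle. Here is a minimal sketch; the clock and FLOPs-per-cycle values below are illustrative assumptions for the comparison, not vendor-confirmed figures:

```python
def peak_fp64_gflops(cores, clock_ghz, flops_per_cycle):
    """Peak theoretical FP64 throughput: cores x GHz x FP64 ops/cycle/core."""
    return cores * clock_ghz * flops_per_cycle

# Assumed inputs for illustration only; a real comparison needs the
# vendor-published all-core clocks and per-core vector/FMA widths.
chips = {
    "Power10 DCM, 24 cores": (24, 3.4, 16),
    "Ice Lake Xeon SP, 40 cores": (40, 2.6, 32),
    "Milan Epyc, 64 cores": (64, 2.45, 16),
}

for name, (cores, ghz, fpc) in chips.items():
    print(f"{name}: {peak_fp64_gflops(cores, ghz, fpc):,.0f} GFLOPS")
```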

The answer is that IBM can absolutely compete, flops to flops, with the best Intel and AMD have right now. And Power10 has a very good matrix math engine as well, which the X86 chips lack.

The problem is, Intel has “Sapphire Rapids” Xeon SPs in the works, which we think will have four 18-core chiplets for a total of 72 cores, but only 56 of them will be exposed because of yield issues that Intel has with its 10 nanometer Enhanced SuperFin (Intel 7) process. And AMD has 96-core “Genoa” Epyc 7004s in the works, too. Power11 is several years away, so if IBM wants to play in HPC, Samsung has to get the yields up on the Power10 chips so IBM can sell more cores in a box. Big Blue already has the memory capacity and memory bandwidth advantage. We will see if its L-class Power10 systems can compete on price and performance once we find out more. And we will also explore how memory clustering might make for a very interesting compute platform based on a mix of fat NUMA and memory-less skinny nodes. We have some ideas about how this might play out.

Tue, 12 Jul 2022, by Timothy Prickett Morgan
Quantum Computing Becoming Real

Quantum computing will begin rolling out in increasingly useful ways over the next few years, setting the stage for what ultimately could lead to a shakeup in high-performance computing and eventually in the cloud.

Quantum computing has long been viewed as some futuristic research project with possible commercial applications. It typically needs to run at temperatures close to absolute zero, which means most people will never actually see this technology in action, which is probably a good thing because quantum computers today are still a sea of crudely connected cables. And so far, it has proven difficult to create enough qubits for a long enough period of time to be useful. But the tide appears to be turning, both for extending the lifetime of quantum bits, also known as qubits, and for increasing the number of qubits that are available.

A qubit is the unit of quantum information that is equivalent to the binary bit in classical computing. What makes qubits so interesting is that the 1s and 0s can be superimposed, which means that a qubit can perform many calculations at the same time in parallel. So unlike a binary computer, where a bit is either 0 or 1, a quantum computer has three basic states—0, 1, and 0 or 1. In greatly oversimplified terms, that allows operations to be carried out using different values at the same time. Coupled with well-constructed algorithms, quantum computers will be at least as powerful as today’s supercomputers, and in the future they are expected to be orders of magnitude more powerful.
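That superposition idea can be sketched in plain Python, no quantum hardware required: a single qubit is just two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1.

```python
import math

# A qubit state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def measurement_probs(alpha, beta):
    p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0), "state must be normalized"
    return p0, p1

# Equal superposition: "0 and 1 at the same time" until it is measured.
h = 1 / math.sqrt(2)
p0, p1 = measurement_probs(h, h)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```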

“The first applications will probably be in things like quantum chemistry or quantum simulations,” said Jeff Welser, vice president and lab director at IBM Research Almaden. “People are looking for new materials, simulating molecules such as drug molecules, and to do that you probably only need to be at around 100 qubits. We’re at 50 qubits today. So we’re not that far off. It’s going to happen within the next year or two. The example I give is the caffeine molecule, because it’s a molecule we all love. It’s a fairly small molecule that has 95 electrons. To simulate the molecule, you simulate the electron states. But if you were to exactly simulate the 95 electrons on that to actually figure out the energy state configuration, it would take 10^48 classical bits. There are 10^50 atoms in the planet Earth, so there’s no way you’re ever going to build a system with 10^48 classical bits. It’s nuts. It would only require 160 qubits to do those all exactly, because the qubits can take on exactly all the quantum states and have all the right entanglements.”
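The exponent in Welser’s numbers comes from the size of the quantum state space: describing an n-qubit state classically takes on the order of 2^n complex amplitudes. A quick check of that scaling (the 160-qubit figure is from the quote; the rest is arithmetic):

```python
# Classically representing an n-qubit state takes ~2**n complex amplitudes,
# which is why exact simulation becomes hopeless so quickly.
def classical_amplitudes(n_qubits):
    return 2 ** n_qubits

# 160 qubits already lands in the ~10^48 range cited for simulating
# caffeine's 95 electrons exactly.
print(f"{classical_amplitudes(160):.2e}")  # 1.46e+48
```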

Fig. 1: IBM’s 50Q system. Source: IBM

Exact numbers and timing tend to get a bit fuzzy here. While almost everyone agrees on the potential of quantum computing, the rollout schedule and required number of qubits is far from clear. Not all qubits are the same, and not all algorithms are created equal.

“There are two elements driving this technology,” said Jean-Eric Michellet, senior director of innovation and technology at Leti. “One is the quality of the qubit. The other is the number of qubits. You need to achieve both quality and quantity, and that is a race that is going on right now.”

These two factors are closely intertwined, because there also are two different types of qubits, logical and physical. The logical qubit can be used for programming, while the physical qubit is an actual implementation of a qubit. Depending on the quality of the qubit, which is measured in accuracy and coherency time (how long a qubit lasts), the ratio of logical and physical qubits will change. The lower the quality, the more physical qubits are required.
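That logical-to-physical ratio can be made concrete with a rough error-correction overhead model. In surface-code-style schemes, the physical cost per logical qubit grows roughly with the square of the code distance, and lower-quality qubits force a larger distance; the constant factor below is an illustrative assumption, not a specific scheme’s figure:

```python
# Rough model: a surface-code-style logical qubit costs on the order of
# 2 * d**2 physical qubits at code distance d; noisier physical qubits
# need a larger d to reach the same logical error rate.
def physical_per_logical(code_distance):
    return 2 * code_distance ** 2

for d in (3, 11, 25):
    print(f"code distance {d}: ~{physical_per_logical(d)} physical qubits per logical qubit")
```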

Another piece is the quality of the quantum algorithms, and there is much work to be done here. “We’re still at the beginning of the software,” said Michellet. “This is a new way of doing algorithms.”

Yet even with the relatively crude algorithms and qubit technology today, quantum computing is beginning to show significant progress over classical computing methods.

“The performance (for quantum computing) is exponential in behavior,” said James Clarke, director of quantum hardware at Intel Labs. “For very small problems, you probably have quite a bit of overhead. When we measure traditional algorithms, there is some crossover point. A quantum algorithm is going to be exponentially faster. There are discussions in the community that this crossover point would be 50 qubits. We actually think it’s more like a thousand or so for certain types of optimization algorithms.”

Intel is working on two qubit technologies—superconducting qubits and spin qubits in CMOS. The company recently demonstrated a 49-qubit computer, based on superconducting technology it has code-named Tangle Lake.

Fig. 2: Intel’s 49-qubit Tangle Lake processor, including 108 RF gold connectors for microwave signals. Source: Intel

Faster sums of all fears
One of the key drivers behind quantum computing is a concern that it can be used to break ciphers that would take too long using conventional computers. The general consensus among security experts is that all ciphers can be broken with enough time and effort, but in the most secure operations that could take years or even decades. With a powerful quantum computer, the time could be reduced to minutes, if not seconds.

This has spawned massive investment by governments, universities, industry, and groups composed of all of those entities.

“Cryptography has really driven research at the government level around the world,” said Clarke. “The thought that security would be compromised is perhaps a worry but perhaps a goal for something like a quantum computer. Within this space, it actually requires a very powerful quantum computer that’s probably many years off.”

The more pressing question involves security measures that are in place today.

“Some, not all, cryptography algorithms will break with quantum computing,” said Paul Kocher, an independent cryptography and computer security expert. “We’re probably not at risk of things being broken over the next five years. But if you record something now, what happens in 30 years? With undercover operations you expect them to be secret for a long time. Something like AES-256 is not breakable any faster with a quantum computer than a conventional computer. The same is true for long keys. But with public keys like RSA and Diffie-Hellman key exchange, those could be broken. We’re still far away from building a quantum computer that could do that, but this could up-end the whole game on the PKI (public key infrastructure) front. Those things are suddenly at risk.”
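Kocher’s distinction can be put into numbers. Grover’s algorithm only square-roots the cost of brute-forcing a symmetric key, so a long key keeps a comfortable margin, while Shor’s algorithm breaks RSA and Diffie-Hellman structurally rather than by faster search. A back-of-envelope sketch:

```python
# Brute-forcing a k-bit symmetric key takes ~2**k classical trials, but
# only ~2**(k/2) with Grover's quantum search -- a square-root speedup,
# not an outright break.
def grover_effective_bits(key_bits):
    return key_bits // 2

for k in (128, 256):
    print(f"AES-{k}: {k}-bit classical security, "
          f"~{grover_effective_bits(k)}-bit post-Grover")
# RSA and Diffie-Hellman fare worse: Shor's algorithm solves factoring
# and discrete logs in polynomial time, so key length alone cannot save them.
```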

Challenges remain
Today’s qubits are far from perfect. Unlike classical bits, they don’t exist for very long, and they aren’t completely accurate.

“That’s the major focus for quantum computing right now,” said IBM’s Welser. “It’s not only how to increase the number of qubits, which we know how to do just by continuing to build more of them. But how do you build them and get the error rates down, and increase coherency time, so that you can actually have time to manipulate those qubits and have them interact together? If you have 100 qubits, but the coherency time is only 100 microseconds, you can’t get them all to interact efficiently to do an actual algorithm before they all have an error. In order to move forward, we talk about something called quantum volume, which then takes into account the number of qubits, the coherency time, the length of time they stay stable, and the number that can be entangled together. Those factors provide what we believe is the best way to compare quantum computers to each other.”
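IBM has a precise definition of quantum volume; the sketch below uses a simplified model in which log2(QV) is the largest n for which an n-qubit circuit of depth n still runs reliably. The depth-budget function here is a made-up illustration of a device whose usable depth shrinks as more qubits are entangled:

```python
# Simplified quantum volume: QV = 2**n for the largest n where the device
# can run a "square" circuit, i.e. n qubits at depth >= n.
def quantum_volume(n_qubits, reliable_depth):
    """reliable_depth(n) -> usable circuit depth when n qubits are active."""
    best = 0
    for n in range(1, n_qubits + 1):
        if reliable_depth(n) >= n:
            best = n
    return 2 ** best

# Hypothetical device: a fixed depth budget divided among active qubits,
# so 8x8 is the largest square circuit it can manage.
print(quantum_volume(20, lambda n: 64 // n))  # 256
```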

IBM is focused on quantum volume as the best path forward, but that point is the subject of debate in academic circles today. “Clearly, getting that coherency time to go longer is important,” Welser said. “But even at the level we’re at right now, we simulated three-atom molecules on our 7-qubit machine and showed that it works. They are error prone, but they get the right answer. You just run the simulation a thousand times—it takes very little time to do that—and then you take the probabilities that come out of that and you map out your answer.”
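The “run it a thousand times” approach Welser describes is essentially repeated sampling and majority voting over noisy shots. A toy model, with a made-up error rate and state labels:

```python
import random

# Each "shot" on a noisy machine returns the right answer with probability
# 1 - error_rate; repeating and tallying recovers the answer statistically.
def run_shots(correct, wrong, error_rate, shots, rng):
    counts = {correct: 0, wrong: 0}
    for _ in range(shots):
        outcome = wrong if rng.random() < error_rate else correct
        counts[outcome] += 1
    return counts

rng = random.Random(42)  # fixed seed so the demo is repeatable
counts = run_shots("state A", "state B", error_rate=0.3, shots=1000, rng=rng)
answer = max(counts, key=counts.get)
print(counts, "->", answer)  # "state A" wins despite a 30% per-shot error rate
```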

One of the key metrics here is the classic traveling salesman problem. If a salesman has a certain route to cover that involves multiple cities that are not in a straight line, what is the most efficient way to manage travel? There is no simple answer to this problem, and it has been vexing mathematicians since it was first posed in 1930. There have even been biological comparisons based upon how bees pollinate plants, because bees are known to be extremely efficient. But the bee studies conclude that complete accuracy isn’t critical, as long as it’s good enough.
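The traveling salesman problem is easy to state and brutal to scale: an exact brute-force solver must consider (n-1)! tours. A minimal sketch with a made-up four-city distance matrix:

```python
from itertools import permutations

# Exact TSP by exhaustive search: fine for a handful of cities, hopeless
# beyond roughly a dozen because the number of tours grows factorially.
def shortest_tour(dist):
    n = len(dist)
    best_len, best_route = float("inf"), None
    for perm in permutations(range(1, n)):  # fix city 0 to avoid rotations
        route = (0, *perm, 0)
        length = sum(dist[a][b] for a, b in zip(route, route[1:]))
        if length < best_len:
            best_len, best_route = length, route
    return best_len, best_route

# Made-up symmetric distances between four cities.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
print(shortest_tour(dist))  # (18, (0, 1, 3, 2, 0))
```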

And that raises some interesting questions about how computing should be done. Rather than exact answers to computational problems, the focus shifts to distributions. That requires less power, improves performance, and it works well enough for big problems such as financial valuation modeling.

Welser said banks already have begun exploring the quantum computing space. “They want to get started working on it now just to figure out what the right algorithms are, and understand how these systems will run and how to integrate them in with the rest of all of their simulations. But they’ll continue to use HPC systems, as well.”

Economies of efficiency
With the power/performance benefits of scaling classical computing chips diminishing at each node after 28nm, and the costs rising for developing chips at the latest process nodes, quantum computing opens up a whole new opportunity. That fact hasn’t been lost on companies such as IBM, Intel, Microsoft, Google and D-Wave Systems, all of which are looking to commercialize the technology.

Fig. 3: D-Wave’s quantum chips. Source: D-Wave

The big question is whether this can all be done using economies of scale in silicon manufacturing, which is already in place and well proven.

“These are larger chips,” said Intel’s Clarke. “For a small chip with 2 qubits, those we can wirebond to our package. Any larger than that, we are starting to do flip-chip bonding. Any larger than about 17, we are adding superconducting TSVs.”

That’s one way of approaching qubit manufacturing, but certainly not the only way. “We are also studying spin qubits in silicon,” Clarke said. “Basically, what we are doing is creating a silicon electron transistor. Instead of having a current through your channels, we trap a single electron in our channel. We put a magnet in the refrigerator. So a single electron in a magnetic field spins up or spins down. Those are the two states of the qubit. Why is this appealing? One of these qubits is a million times smaller than a superconducting qubit in terms of area. The idea that you can scale this to large numbers is perhaps more feasible.”

There are different business models emerging around the hardware. Intel’s is one approach, where it develops and sells quantum chips the way it does with today’s computer chips. IBM and D-Wave are building full systems. And Microsoft and Google are developing the algorithms.

Quality control
One of the big remaining challenges is to figure out what works, what doesn’t, and why. That sounds obvious enough, but with quantum computing it’s not so simple. Because results typically are distributions rather than exact answers, a 5% margin of error may produce good-enough results in one case, but flawed results in another.

In addition, quantum computing is highly sensitive to environmental changes. These systems need to be kept at a constant temperature near absolute zero, and noise of any kind can cause disruptions. One of the reasons these systems are so large is that they provide insulation and isolation, which makes it hard to do comparisons between different machines.

On top of that, quality of qubits varies. “Because you use a qubit once in a certain configuration, will it behave the same in another? That can affect what is true and what is false,” said Leti’s Michellet. “And even if you have exactly the same state, are you running the same algorithms the next time? We need some tools here.”

And finally, the algorithms being used today are so basic that they will undoubtedly change. While this is generally a straightforward process with machine learning and AI to prune and weight those algorithms more accurately, when it comes to quantum computing the algorithms can leverage the superimposed capabilities of the qubits, adding many more dimensions. It’s not entirely clear at this point how those algorithms will evolve, but everyone involved in this space agrees there are big changes ahead.

Quantum computing is coming. How quickly isn’t clear, although the first versions of this technology are expected to begin showing up over the next few years, with the rollout across more markets and applications expected by the middle of the next decade.

While quantum computing is unlikely to ever show up in a portable device, it is a disruptive technology that could have broad implications both for the cloud and for edge devices that connect to these computers. For the chip industry in particular, it provides another path forward beyond device scaling where massive performance gains are not based on cramming more transistors onto a piece of silicon. And given the technical challenges chipmakers are facing at 3nm, the timing couldn’t be better.

—Mark LaPedus contributed to this report.

Related Stories
System Bits: June 5, 2018
Quantum computing light squeezing
Quantum Computing Breakthrough
Aluminum on sapphire qubits provide proof of concept for surface codes and digitized adiabatic evolution.
System Bits: Aug. 1, 2017
Quantum computing steps forward
What’s Next With Computing?
IBM discusses AI, neural nets and quantum computing.
Quantum Madness (blog)
Multiple companies focus on qubits as next computing wave, but problems remain.

Mon, 20 Jun 2022
Enterprise Application Market Size, Share Report, Forecast 2022 to 2028

New Jersey, United States – Enterprise Application Market 2022 – 2028, Size, Share, and Trends Analysis Research Report Segmented with Type, Component, Application, Growth Rate, Region, and Forecast | key companies profiled -IBM Corporation, Oracle Corporation, Microsoft Corporation, and others.

Growth of the Enterprise Application Market is credited to the growing importance of business-oriented enterprise applications, which can be customized for critical business requirements and deployed on various platforms across corporate networks. Enterprise application solutions combined with AI models enable enterprises to reduce uncertainty and make data-driven business decisions. Enterprise applications are well suited to enabling simpler communication, improving business productivity, and creating new opportunities for market expansion. The growing demand among organizations for a single solution to help them solve business problems drives the growth of the enterprise application market.

According to our latest report, the Enterprise Application market, which was valued at US$ million in 2022, is expected to grow at a CAGR of approximate percent over the forecast period.

Receive the demo Report of Enterprise Application Market Research Insights 2022 to 2028 @

Organizations use enterprise application software, which includes various applications such as customer relationship management (CRM), business intelligence, supply chain management, and e-commerce systems, and which can be customized for specific business needs and deployed across corporate networks on various platforms. As firms grow, organizations demand business-oriented enterprise applications that encode corporate policies, rules, and processes and are built in compliance with specific business requirements. With an increase in the number of mobile users in workplaces, these applications can help employees perform better. This also aids the improvement of communication, increases corporate efficiency, and allows businesses to uncover new revenue-generating market opportunities.

Moreover, enterprise applications continue to become mainstream, as more large enterprises consider extraction capabilities for deriving meaningful insights from big data, which bodes well for the Enterprise Application market. However, the small and medium-scale enterprise segment is expected to grow the most, and this trend is expected to continue during the forecast period. Small and medium-scale businesses are shifting to digital platforms and are using enterprise application solutions for various purposes, helping them become more productive, smarter, and more efficient. In the UK, the CRM product segment held over 30% enterprise application market share in 2022, owing to the increasing uptake of CRM enterprise applications for the implementation of AI, big data, IoT, and connected devices. CRM solutions boost employee productivity, enhance customer engagement & retention, and offer various other business benefits.

Segmentation

Based on a product by industry vertical, the healthcare sector will account for a significant share of the CRM segment by 2028. Healthcare providers deal with a massive number of private and confidential datasets of their customers, thus requiring reliable CRM solutions for management. These solutions also help providers access customer data quickly, reducing the chances of casualties.
Based on Component, the market is segmented into Solutions and Services.

Based on Solution Type, the market is segmented into Supply Chain Management, Enterprise Resource Planning, Customer Relation Management, Business Process Management, Enterprise Asset Management, Business Intelligence, Content Management Systems, and Others. The Solution segment dominated the enterprise application market with the largest revenue share in 2022. Enterprise application solutions are a collection of software programs that businesses utilize to improve their operations and business effectiveness.

Access the Premium Enterprise Application market research report 2022 with a full index.

Regional Analysis
Based on Regions, the market is segmented into North America, Europe, Asia Pacific, and Latin America, Middle East & Africa. The Asia Pacific region registered a promising revenue share in the enterprise application market in 2022, due to the widespread use of mobile enterprise application services and solutions by businesses. The significant rise in the APAC area is due to a number of causes, including massive economic advancements, globalization and foreign direct investments, increased smartphone penetration, and internet uptake in the workforce.

Competitors List

The market research report covers the analysis of key stakeholders of the market. Key companies profiled in the report include IBM Corporation, Oracle Corporation, Microsoft Corporation, SAP SE,, Inc., Hewlett-Packard Enterprise Company, IFS AB, QAD, Inc. (Thoma Bravo), Epicor Software Corporation, and Infor, Inc. (Koch Industries).

The following are some of the reasons why you should buy an Enterprise Application market report:

  • The Report looks at how the Enterprise Application industry is likely to develop in the future.
  • Using Porter’s five forces analysis, it investigates several perspectives on the Enterprise Application market.
  • This Enterprise Application market study examines the product type that is expected to dominate the market, as well as the regions that are expected to grow the most rapidly throughout the projected period.
  • It identifies recent advancements, Enterprise Application market shares, and important market participants’ tactics.
  • It examines the competitive landscape, including significant firms’ Enterprise Application market share and growth strategies adopted over the last five years.
  • The research includes complete company profiles for the leading Enterprise Application market players, including product offers, important financial information, current developments, SWOT analysis, and strategies.

Click here to download the full index of the Enterprise Application market research report 2022

Contact Us:
Amit Jain
Sales Co-Ordinator
International: +1 518 300 3575
Email: [email protected]

Wed, 13 Jul 2022, by Newsmantraa
Will The Real UNIX Please Stand Up?
Ken Thompson and Dennis Ritchie at a PDP-11. Peter Hamer [CC BY-SA 2.0]
Last week the computing world celebrated an important anniversary: the UNIX operating system turned 50 years old. What was originally developed in 1969 as a lighter weight timesharing system for a DEC minicomputer at Bell Labs has exerted a huge influence over every place that we encounter computing, from our personal and embedded devices to the unseen servers in the cloud. But in a story that has seen countless twists and turns over those five decades just what is UNIX these days?

The official answer to that question is simple. UNIX® is any operating system descended from that original Bell Labs software developed by Thompson, Ritchie et al in 1969 and bearing a licence from Bell Labs or its successor organisations in ownership of the UNIX® name. Thus, for example, HP-UX as shipped on Hewlett Packard’s enterprise machinery is one of several commercially available UNIXes, while the Ubuntu Linux distribution on which this is being written is not.

When You Could Write Off In The Mail For UNIX On A Tape

The real answer is considerably less clear, and depends upon how much you view UNIX as an ecosystem and how much instead depends upon heritage or specification compliance, and even the user experience. Names such as GNU, Linux, BSD, and MINIX enter the fray, and you could be forgiven for asking: would the real UNIX please stand up?

You too could have sent off for a copy of 1970s UNIX, if you’d had a DEC to run it on. Hannes Grobe [CC BY-SA 2.5]
In the beginning, it was a relatively contiguous story. The Bell Labs team produced UNIX, and it was used internally by them and eventually released as source to interested organisations such as universities who ran it for themselves. A legal ruling from the 1950s precluded AT&T and its subsidiaries such as Bell Labs from selling software, so this was without charge. Those universities would take their UNIX version 4 or 5 tapes and install it on their DEC minicomputer, and in the manner of programmers everywhere would write their own extensions and improvements to fit their needs. The University of California did this to such an extent that by the late 1970s they had released it as their own distribution, the so-called Berkeley Software Distribution, or BSD. It still contained some of the original UNIX code so was still technically a UNIX, but was a significant departure from that codebase.

UNIX had by then become a significant business proposition for AT&T, owners of Bell Labs, and by extension a piece of commercial software that attracted hefty licence fees once Bell Labs was freed from its court-imposed obligations. This in turn led to developers seeking to break away from their monopoly, among them Richard Stallman whose GNU project started in 1983 had the aim of producing an entirely open-source UNIX-compatible operating system. Its name is a recursive acronym, “Gnu’s Not UNIX“, which states categorically its position with respect to the Bell Labs original, but provides many software components which, while they might not be UNIX as such, are certainly a lot like it. By the end of the 1980s it had been joined in the open-source camp by BSD Net/1 and its descendants newly freed from legacy UNIX code.

“It Won’t Be Big And Professional Like GNU”

In the closing years of the 1980s Andrew S. Tanenbaum, an academic at a Dutch university, wrote a book: “Operating Systems: Design and Implementation“. It contained as its teaching example a UNIX-like operating system called MINIX, which was widely adopted in universities and by enthusiasts as an accessible alternative to UNIX that would run on inexpensive desktop microcomputers such as i386 PCs or 68000-based Commodore Amigas and Atari STs. Among those enthusiasts in 1991 was a University of Helsinki student, Linus Torvalds, who having become dissatisfied with MINIX’s kernel set about writing his own. The result which was eventually released as Linux soon outgrew its MINIX roots and was combined with components of the GNU project instead of GNU’s own HURD kernel to produce the GNU/Linux operating system that many of us use today.

“It won’t be big and professional like GNU”: Linus Torvalds’ first announcement of what would become the Linux kernel.

So, here we are in 2019, and despite a few lesser-known operating systems and some bumps in the road such as the SCO Group’s (formerly Caldera Systems) attempted legal attack on Linux in 2003, we have three broad groupings in the mainstream UNIX-like arena. There is “real” closed-source UNIX® such as IBM AIX, Solaris, or HP-UX, there is “Has roots in UNIX” such as the BSD family including MacOS, and there is “Definitely not UNIX but really similar to it” such as the GNU/Linux family of distributions. In terms of what they are capable of, there is less distinction between them than vendors would have you believe unless you are fond of splitting operating-system hairs. Indeed even users of the closed-source variants will frequently find themselves running open-source code from GNU and other origins.

At 50 years old then, the broader UNIX-like ecosystem, which we’ll take to include the likes of GNU/Linux and BSD, is in great shape. At our level it’s not worth worrying too much about which is the “real” UNIX, because all of these projects have benefitted greatly from the five decades of collective development. But it does raise an interesting question: what about the next five decades? Can a solution for timesharing on a 1960s minicomputer continue to adapt for the hardware and demands of mid-21st-century computing? Our guess is that it will, not in that your UNIX clone in twenty years will be identical to the one you have now, but the things that have kept it relevant for 50 years will continue to do so for the foreseeable future. We are using UNIX and its clones at 50 because they have proved versatile enough to evolve to fit the needs of each successive generation, and it’s not unreasonable to expect this to continue. We look forward to seeing the directions it takes.

As always, the comments are open.

Fri, 15 Jul 2022 | Jenny List
Best Of Breed 2021


The Best of Breed (BoB) Conference meets the evolving needs of the IT channel’s largest, fastest-growing, and most progressive solution provider organizations and the top technology vendors and distributors. An invitation-only event, the BoB Conference brings together 100+ attendees from CRN’s elite solution provider lists (Solution Provider 500, Tech Elite 250 and Fast Growth 150) to connect and engage over the course of 2 days. The in-person event features empowering CEO interviews, SP 500 solution provider spotlights, economic and market trend sessions, executive panel discussions and briefings, and peer-to-peer networking.

Get all of CRN's coverage of the event here and follow along on Twitter at #bob21.

    10 Big Cybersecurity Bets For 2022 From Optiv CEO Kevin Lynch
    From data governance, anti-ransomware and managed XDR to advisory services, managed implementation and faster delivery, here’s where Optiv CEO Kevin Lynch plans to place his bets in 2022.

    HPE CEO Antonio Neri’s 10 Boldest Statements From Best Of Breed 2021
    Neri talks about the growing importance of data, integrated platforms, and opportunities for partners in 5G, connectivity, and the HPE GreenLake edge-to-cloud platform.

    HPE CEO Neri: Steer Clear Of Public Cloud, Slash Costs By Up To 50 Percent
    One solution provider tells CRN that a customer is looking at that level of savings by moving workloads out of the public cloud and into HPE’s GreenLake platform.

    HPE CEO Antonio Neri: Dell Apex ‘Is VMware—It’s Not Dell’
Neri says that the early version of the Dell Apex solution doesn’t offer as broad a set of as-a-service solutions as HPE’s GreenLake offering, and is predominantly built around the VMware control plane.

    HPE CEO Antonio Neri To ‘Personally Lead’ Initiative To Boost GreenLake Experience
    The CEO of Hewlett Packard Enterprise says he will oversee a 30-person team working to enhance ‘all aspects of the experience’ involved with the GreenLake as-a-service consumption model.

    Hybrid Cloud Doesn’t Have To Be Intimidating, Says IBM
    ‘We can’t do this alone. With the $1 trillion opportunity around hybrid cloud, the only way we are going to succeed is with you,’ IBM’s Deepa Krishnan tells solution providers.

    Cisco CEO Chuck Robbins’ 10 Boldest Statements From Best Of Breed 2021
    ‘Software … allows us to move faster, innovate more quickly and allows the customer to actually get to the outcome faster. And if we get it right, it’s better for both our business models because [it provides] more predictability. But we have to get it right because it’s complicated to figure all that out,’ Cisco CEO Chuck Robbins tells an audience of solution providers.

    Ingram Micro’s Kirk Robinson: New Ownership Means New Channel Investment
    ‘Platinum [Equity] is in the business of making great companies greater, and they’re fully prepared to leverage their resources and experience to help Ingram Micro grow. And now that we’re U.S.-owned again, we have additional opportunities, not the least of which is in the public sector market where Platinum has proven experience,’ says Kirk Robinson, Ingram Micro’s chief country executive for the U.S.

    Ingram Micro Acquires CloudLogic In Big Cloud Services Play
    ‘CloudLogic not only advises on the best, most efficient, and effective way to run your customers’ applications, but they can also run reports on your customers’ technology, software licensing, and cloud spend through what we call IT Portfolio Optimization,’ says Kirk Robinson, Ingram Micro’s chief country executive for the U.S.

    Frank Vitagliano: ‘The Smart Money … Has Gravitated To The Distributors’
    ‘The smart money in the marketplace … has gravitated to the distributors. What they see is not only the value of what we’re doing today, but also the opportunity to enhance that,’ says Frank Vitagliano, CEO of the Global Technology Distribution Council.

    Cisco’s Chuck Robbins On What Subscriptions Mean For Partners: ‘It’s Better For Both Our Business Models’
    ‘One thing I’ve told the team all along is I don’t know what the solution is, but the answer is we have to do it with our partner community, and that’s just the way it is,’ Cisco CEO Chuck Robbins said at The Channel Company’s 2021 Best of Breed Conference.

    Cynet: Automate, Consolidate Security Functions With XDR
‘When you’re selling one consolidated XDR platform, it’s a single setup. Your win rates are higher, and your margins are higher as well,’ says Royi Barnea, Cynet’s head of channel sales in North America.

    Customer Engagement Strategies Changing To Meet New Challenges
    ‘We are in definitely a hybrid place today of kind of a mix of digital and traditional. And that actually isn’t going away. And expectations of customers are absolutely going to continue to remain in that space, and they’re going to want to interact in that fashion,’ says Jade Surrette, chief marketing officer for The Channel Company.

    IBM CEO Arvind Krishna’s 10 Boldest Statements From Best Of Breed 2021
IBM CEO Arvind Krishna talked Red Hat integration plans, supply chain issues, partner opportunities, and took aim at VMware during a Q&A onstage at the Best of Breed Conference 2021.

    Arvind Krishna: IBM ‘Dead Serious’ About Partner Push; Upcoming Growth Has ‘Got To Be’ Through Channel
    IBM’s CEO tells partners at The Channel Company’s Best of Breed Conference that the IT giant has stepped up to invest in the channel, and offers major opportunities that include hybrid cloud, Red Hat solutions and security. ‘Now, let’s go grow the business together,’ Krishna said.

    IBM CEO Arvind Krishna: Chip Shortage ‘More Likely’ Continuing Until 2023 Or 2024
    Krishna said he sees any suggestion that a resolution could come by 2022 as ‘optimistic,’ and called upon the U.S. government to do more to support a larger return of semiconductor manufacturing to the country.

    ‘Geographically Diversify Manufacturing’ To Solve Supply Chain Crisis: Analyst
    ‘There’s no question we need to geographically diversify manufacturing. We absolutely have put too much dependence on Asia without having a more predictable macroeconomic and geopolitical relationship,’ says Daniel Newman of Futurum Research.

Mon, 11 Oct 2021

Bot Services Market Growing at a CAGR 33.2% | Key Player Microsoft, IBM, Google, Oracle, AWS

    “Microsoft (US), IBM (US), Google (US), Oracle (US), AWS (US), Meta (US), Artificial Solutions (Sweden), eGain (US), Baidu (China), Inbenta (US), Alvaria (US), SAP (Germany), Creative Virtual (UK), Gupshup (US), Rasa (US), Pandorabots (US), Botego (US), Chatfuel (US), Pypestream (US), Avaamo (US), Webio (Ireland), ServisBOT (US).”

    Bot Services Market by Service Type (Platform & Framework), Mode of Channel (Social Media, Website), Interaction Type, Business Function (Sales & Marketing, IT, HR), Vertical (BFSI, Retail & eCommerce) and Region – Global Forecast to 2027

The Bot Services Market size is expected to grow from USD 1.6 billion in 2022 to USD 6.7 billion by 2027, at a Compound Annual Growth Rate (CAGR) of 33.2% during the forecast period. Various factors such as the rise in the need for 24x7 customer support at a lower operational cost, the integration of chatbots with social media to augment marketing strategy, and innovations in AI and ML technologies for chatbots resulting in better customer experience are expected to drive the adoption of bot services.


    According to Microsoft, Azure Bot Service provides an integrated development environment for bot building. Its integration with Power Virtual Agents, a fully hosted low-code platform, enables developers of all technical abilities to build conversational AI bots without the need for any further coding. The integration of Azure Bot Service and Power Virtual Agents enables a multidisciplinary team with a range of expertise and abilities to build bots inside a single software as a service (SaaS) solution.

    Healthcare and Life Sciences vertical to witness the highest CAGR during the forecast period

The segmentation of the bot services market by vertical includes BFSI, retail & eCommerce, healthcare & life sciences, media & entertainment, travel & hospitality, IT & telecom, government, and others (automotive, utilities, education, and real estate). The healthcare industry is developing rapidly thanks to major technological advancements that enhance the overall patient experience. Hospitals and other health institutions are increasingly adopting bot services to improve the overall experience of patients, doctors, and other staff. Additionally, bot services can enhance patient experience and build patient loyalty, while improving organizational efficiency. Moreover, bots, also known as virtual health assistants, notify patients about their medication plans, address concerns, deliver diagnosis reports, educate them regarding certain diseases, motivate them to exercise, and personalize the user experience.

    Some major players in the bot services market include Microsoft (US), IBM (US), Google (US), Oracle (US), AWS (US), Meta (US), Artificial Solutions (Sweden), eGain (US), Baidu (China), Inbenta (US), Alvaria (US), SAP (Germany), (Netherlands), Creative Virtual (UK), (US), [24] (US), Gupshup (US), Rasa (US), Pandorabots (US), Botego (US), Chatfuel (US), Pypestream (US), Avaamo (US), Webio (Ireland), ServisBOT (US), (India), Cognigy (Germany), Enterprise Bot (Switzerland), Engati (US), and Haptik (US). These players have adopted various organic and inorganic growth strategies, such as new product launches, partnerships and collaborations, and mergers and acquisitions, to expand their presence in the global bot services market.


    Artificial Solutions (Sweden) is a leading specialist in Conversational AI solutions and services. The solution offered by the company enables communication with applications, websites, and devices in everyday, human-like natural language via voice, text, touch, or gesture inputs. Artificial Solutions’ conversational AI technology makes it easy to build, implement, and manage a wide range of natural language applications, such as virtual assistants, conversational bots, and speech-based conversational UIs for smart devices. Artificial Solutions offers bot services and solutions to various industries, such as financial services, retail, automotive, telecom, energy and utilities, travel and leisure, and entertainment. Artificial Solutions has won several awards, such as the 2019 Stevie Awards for Sales and Customer Service, the 2018 Speech Industry Awards, and the 2018 AICONICS: Best Intelligent Assistant Innovation. The company’s major customers include AT&T, Shell, Vodafone, TIAA, Volkswagen Group, Deutsche Post, Widiba, Telenor Group, Accenture, KPMG, Cognizant, Wipro, and Publicis Sapient. It has development centers in Barcelona, Hamburg, London, and Stockholm and offices across Europe, Asia Pacific, and South America.

    In the bot services market, it provides Teneo, a platform that enables business users and developers to collaborate to create intelligent conversational AI applications. These applications operate across 35 languages, multiple platforms, and channels in record time.

eGain (US) is a leading cloud customer engagement hub software supplier. eGain products have been used to improve customer experience, streamline service processes, and increase revenue across the online, social media, and phone channels for over a decade. eGain helps hundreds of the world’s leading organizations turn their disjointed sales and customer service operations into unified customer engagement hubs (CEHs). In North America, Europe, the Middle East, Africa, and Asia Pacific, eGain Corporation develops, licenses, implements, and supports customer service infrastructure software solutions. It offers a unified cloud software platform to automate, augment, and orchestrate consumer interactions. It also provides subscription services, which give users access to its software via a cloud-based platform, as well as professional services, including consultation, implementation, and training. The company caters to the financial services, telecommunications, retail, government, healthcare, and utilities industries.

    In the bot services market, the company offers AI Chatbot Virtual Assistant software which improves customer engagement. The VA acts as a guide, helping customers navigate the website and taking them to the relevant places on a page. The virtual assistant provides answers to any queries, even helping in making shopping decisions.

Baidu (China) provides internet search services. It is divided into two segments: Baidu Core and iQIYI. The Baidu app helps customers to access search, feed, and other services through their mobile devices. Baidu Search helps users to access the company’s search and other services. Baidu Feed gives users a customized timeline based on their demographics and interests. The company provides products, including Baidu Knows, an online community where users can ask questions to other users; Baidu Wiki; Baidu Healthcare Wiki; Baidu Wenku; Baidu Scholar; Baidu Experience; Baidu Post; Baidu Maps, a voice-enabled mobile app that provides travel-related services; Baidu Drive; Baijiahao; and DuerOS, a smart assistant platform. The company also provides online marketing services such as pay for performance, an auction-based service that enables customers to bid for priority placement of paid sponsored links and reach users searching for information about their products or services. Other marketing services offered by the company are display-based marketing services and other online marketing services based on performance criteria other than cost per click. The company offers a mobile ecosystem, which includes Baidu App, a portfolio of applications. Further, the company provides iQIYI, an online entertainment service, including original and licensed content; video content and membership; and online advertising services.

In the bot services market, Baidu offers Baidu Bot, the search bot used by Baidu, which collects documents from the web to build a searchable index for the Baidu search engine.

    Media Contact
    Company Name: MarketsandMarkets™ Research Private Ltd.
    Contact Person: Mr. Aashish Mehra
    Phone: 18886006441
    Address:630 Dundee Road Suite 430
    City: Northbrook
    State: IL 60062
    Country: United States

Wed, 13 Jul 2022 | GetNews

Is OSGi the Solution for Mobile Java?

The 2007 JavaOne conference reflected the fact that mobile computing—for both consumers and enterprise workers—is transitioning from early adoption to the mass market. But Java ME developers still face many obstacles that server-side or desktop Java developers never have to contend with. Those issues include:
    • fragmentation of the Java ME platform
    • the absence of mobile runtime environments that adequately leverage the capabilities of advanced "smartphone" devices
    • the difficulty of managing mobile applications and configurations once the device has left the building
    • the architectural chasm that separates common Java web development skills and APIs from the specialized rich client practices employed when developing for mobile devices.

Nokia, Sprint, and IBM teamed for a JavaOne session that outlined a solution to these problems through a service-oriented architecture based on OSGi. OSGi was originally developed for telematics applications where remote management, pluggability, and "hot" software and firmware updates (no restarts) were required. As mobile handsets increasingly are used as always-on application platforms, and particularly as carriers and developers start to grapple with how to bring some of the dynamism of loosely coupled component architectures to the currently static mobile Java environment, handsets become a vast new frontier for OSGi.

The specification for the standard under which this work is taking place is JSR 232: Mobile Operational Management. As Jon Bostrom, Chief Java Architect for Mobile Software at Nokia put it, the vision for this is based on the Web 2.0 principle of "innovation in assembly": to bring an open component model in which services built into the platform can be plugged together with others provided by developers in a flexible, but highly manageable manner. OSGi turns the device into an OS-agnostic application server in your pocket. It enables new components to be delivered on the fly, manages their lifecycle and permissions, and provides a shared event bus as well as services like monitoring and logging. In fact, since OSGi includes a servlet container, OSGi bundles (pluggable components) don't necessarily have to be written as Java ME applications—they can be standard servlets living on the edge of the network.
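The dynamic service registry at the heart of this model is easiest to see in miniature. The sketch below is purely illustrative Python, not the real OSGi API (which is Java, with BundleContext, versioning, permissions, and events layered on top); it shows only the core idea that services are published, looked up, and withdrawn at runtime, with no restart of the host.

```python
# Illustrative sketch of OSGi's central idea: a dynamic service registry.
# Not the real OSGi API; just the register/lookup/unregister lifecycle.

class ServiceRegistry:
    def __init__(self):
        self._services = {}  # interface name -> implementation

    def register(self, interface, service):
        # A "bundle" publishes an implementation under an interface name.
        self._services[interface] = service

    def unregister(self, interface):
        # Hot removal: consumers simply stop finding the service.
        self._services.pop(interface, None)

    def get(self, interface):
        # Lookup happens at call time, not at link time.
        return self._services.get(interface)

registry = ServiceRegistry()
registry.register("LogService", lambda msg: f"LOG: {msg}")

log = registry.get("LogService")
print(log("bundle started"))       # service resolved dynamically

registry.unregister("LogService")
print(registry.get("LogService"))  # None once the "bundle" is gone
```

In real OSGi the lookup goes through `BundleContext.getServiceReference()`, and the framework tracks which bundle registered what so it can clean up automatically when a bundle stops.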

From this broad vision different participants in the JSR 232 working group seem to be moving in somewhat different directions. For Nokia, for example, the idea is to create a "mobile innovation engine" that promotes mobile mashups. These will not just be mashups of web services that we're familiar with today. They could include components like alternative GUI rendering engines that enable easy ports from other platforms, for example. Mobile Java developers have long been constrained by the limited UI toolkit that CLDC/MIDP provides. On a device with JSR 232 they will have a much more powerful CDC/FP runtime and class libraries to work with, making AGUI and Swing GUIs a possibility, as well as the embedded versions of SWT and the Eclipse Rich Client Platform (eRCP). According to Bostrom there is no reason that engines like ActionScript/Flash, OpenLaszlo or Flex/Apollo could not be plugged in as well. Most importantly, the developer doesn't need to wait for the Java Community Process or device manufacturers to bring these components to the handset: once wrapped into an OSGi bundle they can be installed to the device over the air and registered as a service, much like an Eclipse plugin is installed via Update Manager. In fact, OSGi is the technology that makes this possible in Eclipse without having to restart the workbench after the installation. This should be a very exciting prospect for mobile developers.

    For IBM, having a server in your pocket suggests an ever broader view: what IBM Distinguished Engineer Jim Colson called a "symmetric portal model." Here OSGi enables service layers that literally span everything from sensors, to smartphones, laptops and desktops and in each case these services are accessed via familiar technologies like JMS and servlets. That unified architecture has several advantages. Obviously, it opens up mobile devices to a very large group of developers with skills that are common in enterprise IT departments. It also addresses a problem that has hindered the use of standard web technologies on devices: the limited coverage and high latency of wireless networks. In IBM's OSGi-based Lotus Expeditor managed client software an application can be run client/server even in disconnected mode. IBM considers Lotus Expeditor to be an "open alternative to .NET" that spans Windows, Windows Mobile, Nokia S60 and Mac OS. Just like Nokia's mobile OSGi implementation, IBM's enables rich client apps using pluggable GUI libraries and encourages development by composition. But the business proposition is extending existing SOA technologies "beyond the data center to people, places and things."

    OSGi opens the door to a more dynamic mobile Java environment, and Nokia's Asko Komsi states that this together with OSGi services like configuration, monitoring and conditional privileges "provide a lot of the features that I think we need to make CDC usable." But it also raises questions that will need to be resolved by the JCP in specifications such as JSR 249, Advanced Mobile Service Architecture. Komsi explains:

    On a high level, JSR 249 has to find solutions for key features like the primary installation mechanism, the application model, and the packaging model—how to package those applications and middleware components so that you can send them to handsets. Additionally, JSR 249 needs to find a solution for managing the environment and the applications and services running on it. In the future you will also have powerful client environments that will allow you to run multiple applications. So we also need to define an application cooperation mechanism. These are features for which we have to find a solution in JSR 249. If we don't have them, then we're only half done, and we might be faced with fragmentation once again.

    But Nokia is not waiting any longer to get OSGi into your mobile. They have teamed up with Sprint to develop a JSR 232 implementation they call the Titan platform that will be shipping very soon. Brandon Annan, Manager of Software Platforms for Sprint's 3G Customer Equipment group, predicted a "Golden Age of mobile Java technology" that would begin in the 4th quarter of this year with the launch of three or four Titan-enabled "PDAs" (presumably with EVDO radios). JSR 232 handsets will follow in mid-2008. Sprint's 4G product division is also "seriously considering" JSR 232 for the WiMax devices they will be releasing for that eagerly anticipated network rollout. Outside of Sprint the upcoming Nokia E90 handset will also be shipping to the European market with OSGi and eRCP on board. Whether this premium smartphone sees North American shores is not yet clear.

Mobile Java developers who can't wait until later this year can start deploying applications on CDC and OSGi today, but they need to choose devices with open operating systems like Windows Mobile and Linux. Brian Coughlin, the Senior Technology Strategist for Sprint's 4G group, put together an OSGi mobile mashup demo on the Nokia N800 Internet Tablet, for example, a Linux device for which Nokia has suggested there will be a WiMax version. Developers can get a CDC Java runtime for the N800 and the other components, including an Equinox OSGi implementation and the mashup servlet bundle here.

Wed, 15 Jun 2022

Quantum banking: The next leap in financial services

Quantum computers have now probably reached the point at which they can carry out calculations that would be impossible even using the world’s fastest conventional supercomputers. The Zuchongzhi quantum computer is claimed to be 10 million times faster than the current fastest supercomputer, and the photon-based Jiuzhang 2 can calculate in one millisecond a (very specific) task that would, it is said, take the world’s fastest conventional computer 30 trillion years.

However, the real significance of the advances in quantum computing comes not so much from quantum primacy in niche scientific tasks, but from what Dr. Darío Gil, senior vice president and director of research at IBM, calls ‘quantum [business] advantage’. That is where quantum computers would consistently outperform conventional computers on problems that are useful for companies. This requires commercially viable hardware and software that can deliver value in real-world business use cases. In industries like banking, the potential is enormous, and we are closer to this transformation than you may think.


    So, what kinds of real-world banking challenges may benefit from a quantum solution? Spain’s CaixaBank has been exploring this question since 2018, when it set up a team of experts with IT technicians, mathematicians, and risk analysts. This team identified areas such as overall risk assessment and tail risk simulators, fraud detection with artificial intelligence and machine learning, quantum safe cryptography, portfolio selection and allocation, and data mining optimisation, among other areas.

The bank has worked with IBM’s Qiskit simulator and 16-qubit quantum computer on complex multi-variate analysis – determining that quantum computing was indeed a step forward over conventional technologies. It has experimented with the PennyLane quantum computing framework from Xanadu, testing its use in machine learning, particularly in defining a scoring model for risk assessment. And it has explored the use of high-performance quantum random number generation solutions in simulations, alongside QUSIDE Technologies, a Spanish spin-off from the Institute of Photonic Sciences, a government research centre in Catalonia.

    These partnerships include experiments on real data and real risk management problems. In one example, CaixaBank explored the implementation of a quantum algorithm capable of assessing the financial risk of two portfolios created specifically for the project, one consisting of mortgages and the other, treasury bills.

    The quantum algorithm reached the same conclusions as the traditional method, using just a few dozen simulations, versus the thousands or millions required by traditional methods. This speed advantage results in cost savings in compute time, faster and more accurate risk management, and, in the case of an application in, say, derivatives, could mean being able to price complex products for customers in real-time rather than needing several hours.
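The gap between "a few dozen" and "thousands or millions" of simulations comes from how the two approaches converge. A classical Monte Carlo estimate of a tail-loss probability has a standard error that shrinks only as 1/√N, while quantum amplitude estimation converges roughly as 1/N, a quadratic speedup. The toy sketch below shows the classical side; the normal loss model, threshold, and sample counts are illustrative assumptions, not CaixaBank's actual portfolios.

```python
import math
import random

def tail_probability(threshold, n_samples, seed=7):
    """Plain Monte Carlo estimate of P(loss > threshold).

    Toy model: portfolio loss ~ N(0, 1). The standard error of the
    estimate falls only as 1/sqrt(n_samples), which is why classical
    risk runs need so many simulation paths.
    """
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples) if rng.gauss(0.0, 1.0) > threshold)
    p_hat = hits / n_samples
    std_err = math.sqrt(max(p_hat * (1.0 - p_hat), 1e-12) / n_samples)
    return p_hat, std_err

# Quadrupling the sample count only halves the error bar; a quantum
# amplitude-estimation routine converges roughly as 1/N instead.
for n in (10_000, 40_000):
    p, err = tail_probability(2.0, n)
    print(f"N={n}: p ~ {p:.4f} +/- {err:.4f}")
```

Quadrupling the work from 10,000 to 40,000 paths only halves the error bar here, which is exactly the scaling wall that quantum risk algorithms target.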

    Quantum computers as they currently exist could be much better at some kinds of problem-solving than others. One area in which they excel is optimisation, and here a technique known as quantum annealing can be used to tackle real-life problems, including portfolio selection and allocation. Again, CaixaBank has been a pioneer of this in the banking sector, with its life insurance and pensions’ subsidiary, VidaCaixa, working with Canadian D-Wave Systems Inc., leaders in the development of quantum annealing processors.
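Annealers such as D-Wave's minimise an energy function over bitstrings, so problems like portfolio selection are first encoded as a QUBO (quadratic unconstrained binary optimisation). The sketch below is a toy four-asset example with made-up returns, covariances, and penalty weights (not CaixaBank's or VidaCaixa's model); brute-force enumeration stands in for the annealer, which samples low-energy states of the same objective.

```python
from itertools import product

# Toy QUBO for selecting K assets: reward expected return, penalise
# pairwise covariance (risk) and deviation from the target book size.
# All numbers are illustrative assumptions, not a real model.
returns = [0.08, 0.12, 0.10, 0.07]           # expected return per asset
cov = [[0.00, 0.02, 0.01, 0.00],             # symmetric risk terms
       [0.02, 0.00, 0.03, 0.01],
       [0.01, 0.03, 0.00, 0.02],
       [0.00, 0.01, 0.02, 0.00]]
K = 2                                        # target number of holdings
risk_aversion = 1.0
penalty = 0.5                                # weight of the cardinality term

def qubo_energy(x):
    """Energy an annealer would minimise for the bitstring x."""
    ret = sum(r * xi for r, xi in zip(returns, x))
    risk = sum(cov[i][j] * x[i] * x[j]
               for i in range(len(x)) for j in range(len(x)))
    cardinality = (sum(x) - K) ** 2
    return -ret + risk_aversion * risk + penalty * cardinality

# Brute force over all 2^4 bitstrings stands in for the annealer here.
best = min(product([0, 1], repeat=len(returns)), key=qubo_energy)
print("selected assets:", [i for i, xi in enumerate(best) if xi])  # -> [1, 3]
```

Most of the modelling effort in practice goes into the encoding itself: scaling the risk term and choosing the constraint penalty so that the annealer's low-energy states correspond to genuinely good portfolios.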

The most recent fruits of this collaboration came in 2022, when CaixaBank became one of the first banks in the world to apply quantum computing to investment portfolio hedging calculation in the insurance sector. Using D-Wave’s Leap quantum cloud service, CaixaBank Group’s team coded an algorithm which markedly reduced the computing time necessary to reach an optimal solution for improving investment portfolio hedging.

What normally took the bank several hours of compute time was reduced to just minutes – decreasing the compute time over the traditional solution by up to 90%. This reduction of compute time facilitates increased modelling complexity, allowing for a more dynamic model that is better adapted to real-time markets; it improves the hedging decision-making process and makes the capital allocations necessary for hedging operations more efficient.

    These types of study suggest that banks could already use quantum computing to gain a competitive edge over their peers in specific areas. So how does this square with the conventional wisdom that we are between five and 10 years away from commercially viable quantum computers for general purposes?

    The answer is in those ‘specific areas’. Spanish quantum start-up, Multiverse Computing, says that banks can already get a 100-fold advantage by using quantum computers to solve narrow problems, such as portfolio optimisation and fraud detection, even though today’s quantum computers are still relatively limited, with less than 100 qubits and high error rates.

    So, quantum annealing produces good results in optimisation, whilst quantum gate technology, such as IBM’s machines, performs most notably in tasks such as machine learning. And there are problems, such as using Monte Carlo simulations to price complex financial instruments, where traditional computers are still a better solution.

    CaixaBank’s view is that, while quantum technology is not at the point where any data scientist could use this kind of technology day-to-day, the potential impact on banking is huge. This means that quantum computing should be a key tenet of any banking technology strategy today, and that banks which aspire to leadership in this digital age must embrace quantum technology as part of their innovation agenda.

    Banks who start this work even before the machines have reached maturity will be in a far better position to utilise the bigger and faster machines as they become available. Being quantum ready is a critical head start. That is why CaixaBank chose to be one of the first banks in the world to incorporate quantum computing into its innovation activity.

    How Tomago made SAP HANA sing

    At the Tomago Aluminium Company plant in New South Wales, production has continued 24 hours a day since 1983 – and an SAP ERP system plays a part in keeping production running.

    Read this success story to learn how Tomago made its SAP HANA implementation sing performance-wise.

    It worked with Advent One to migrate its SAP HANA platform from the public cloud to a private cloud running on IBM Power Systems. Advent One were able to complete the migration, leveraging automation, within 6 days.

    This was one of the most successful projects Tomago ever had, its CFO told Advent One.

    Tomago gained the performance and flexibility its old public cloud solution lacked. That’s benefitted the company’s finance department and helped staff members make faster and more informed decisions. And Tomago’s IT Superintendent now has a “very stable and resilient system”.

    Download the case study, provided by Advent One. It covers the:

    • challenges that led to the SAP migration
    • underlying IBM infrastructure 
    • rapid deployment and close teamwork required
    • key business outcomes

    This content has been created and paid for by Advent One & IBM.

    Long Read on the state of quantum computing in China

    The authors of this article, Josie-Marie Perkuhn, Tania Becker, Nancy Wilms and Sven Pabis, are a group of scientists from various German universities who are collaborating on the online magazine “chinnotopia – Future designed by China”, which provides an introduction to highly diverse aspects of Chinese innovation culture. This in-depth description of quantum computing offers us some insight into a development that could change the world from the ground up.

    The development of quantum computers is currently taking centre stage in science and politics worldwide, as well as being a matter of general public interest. Global players in the field of research are mostly to be found in the USA, China and Europe. Because the development and implementation of industry norms have not yet been standardised at this stage of research, there will not be one single universal quantum computer; instead, multiple approaches based on different technologies and application fields will emerge simultaneously. It is not yet possible to predict when we will reach the point of having a viable quantum computer. The quantum leap has not happened yet. Robustness and stability are essential requirements to ensure that a quantum computer is suitable for as wide a range of applications as possible. The potential is huge; however, there is a lack of stable hardware and reliable software, and the algorithms needed to use a quantum computer with precision still remain to be written.

    The current state of research and the special features of quantum computers

    A quantum computer applies the principles of quantum mechanics microelectronically to solve complex mathematical problems, exploiting the inherent ambivalence of quantum states. These problems are either not solvable by today’s most powerful supercomputers – for instance the Japanese Fugaku, which has almost half a million teraflops of processing power – or would take them an inordinately long time. Quantum computers can also find solutions that have so far remained inaccessible to us despite the high power of classic computers, and this will be the start of a new dimension of digitalisation. Processing power increases exponentially with the number of qubits.

    What are Qubits?

    Qubit is the common abbreviation for quantum bit, the fundamental information unit of a quantum computer. Classic computers are based on the bit, which can assume just two states (0 or 1). A qubit, which can be made from an atom or photon for instance, can assume not just 0 and 1, but simultaneously every state that is a vector of length 1 – in other words, a superposition.
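    The state of a qubit can be written down classically as a length-1 complex vector, which also shows why simulating quantum machines gets hard: a register of n qubits needs 2**n amplitudes. A minimal illustration in plain Python, not tied to any particular quantum SDK:

```python
import math

def make_qubit(alpha, beta):
    """Normalise (alpha, beta) into a valid qubit state
    |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1."""
    norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    return (alpha / norm, beta / norm)

def measurement_probabilities(qubit):
    """Born rule: the chance of reading 0 or 1 is the squared
    magnitude of the corresponding amplitude."""
    alpha, beta = qubit
    return abs(alpha) ** 2, abs(beta) ** 2

# An equal superposition, as a Hadamard gate produces from |0>:
plus = make_qubit(1, 1)
p0, p1 = measurement_probabilities(plus)
# A register of n qubits needs 2**n such amplitudes to describe
# classically -- the source of the exponential growth in processing
# power mentioned above.
```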

    The development of quantum computers has been happening for some time in tech labs in the USA. The major corporations based there, like Microsoft, Amazon, Google, Apple, Meta (Facebook), IBM and Intel, are supporting various projects with the objective of creating a quantum computer that functions flawlessly. The research race is in full swing: private businesses – but also many academic establishments such as the Massachusetts Institute of Technology (MIT) – are currently developing increasingly powerful computers, which are already able to tap into huge potential on a scale of hundreds of qubits. Numerous start-ups amply funded with venture capital are working on the quantum technology challenge too. At the end of 2021 the new “Eagle” quantum chip was heralded by IBM as the first quantum processor in the world with a total of 127 qubits. This would make the IBM quantum computer an important milestone on the pathway towards the practical application of quantum technology. The fact is, its processing power exceeds that of classic supercomputers by a factor of a million. IBM is already planning “System Two” as an infrastructure for new and more powerful processors, with around 300 qubits. The Forschungszentrum Jülich (Jülich Research Centre; FZJ) offers an example of European approaches to quantum computing. They created a new-generation quantum computer as a pioneering project here, entitled Jupsi (Juelich Pioneer for Spin Interference). The scientists are already discussing the technical potential of building a quantum computer with several million qubits. Visions of a future that will be shaped by such powerful computers lie beyond the bounds of our imagination.

    The People’s Republic of China and human capital

    The USA’s advantage over China has been shrinking recently. This is apparent from the thriving presence of the Chinese internet industry. The “Big Three”, Baidu, Alibaba and Tencent (BAT), have increasingly been investing in research relating to quantum technology of this nature, and are constantly on the look-out for innovative minds. Alongside these and other large-scale business investors, fully or partially state-funded institutions are also working hard to recruit the right kind of talent within the home market, as well as overseas in the Western world. For instance spin-offs of Chinese think-tanks, innovation hubs, accelerators and incubators are becoming established. In this respect a clearly defined focus and rigorous implementation policy are emerging as one of China’s comparative strengths:
    The “High-Level Talent Recruitment Program”, also known as the “Thousand Talents Plan” (Qianren jihua 千人计划), started back in late 2008 against the backdrop of the global financial crisis. The idea was to recruit leading international experts systematically, and at the same time exert influence overseas to encourage the top Chinese scientists educated at Western elite universities to return to their home country. These measures imposed by the government and the favourable proposals of being able to find good jobs in their native country seem to be successful: plenty of tech talents, especially in the fields of artificial intelligence, machine learning, software development and quantum computing, are coming to China. This trend cannot be overlooked: according to the Chinese Returnees from Overseas Study, over 70 per cent of Chinese undergraduates and researchers who had relocated overseas are now returning. In line with the proverbial government directive “Picking flowers in foreign lands to make honey in China” (Yiguo caihua, Zhonghua niangmi 异国采花,中华酿蜜), China’s government specifically encourages the acquisition of intellectual property for the purpose of strategic advantage. The expertise of the returnees helps China. And that befits a strategy designed to achieve global superiority in the application and creation of artificial intelligence (AI), in which quantum technology is a key element.

    Political strategy

    In China quantum technologies have been a focus of political strategies for a long time. This is also clear from the rigorous government planning: in the current 14th five-year plan (2021–2025) they announce “major breakthroughs”. These are supposed to emerge in technology sectors such as quantum information technology, artificial intelligence, semiconductors and space travel. It is hoped that significant development advances will be achieved in quantum technologies as a result of this systematic brain gain, as well as numerous national innovation projects, training labs and state funding – and that this will put the Chinese tech industry on course to become a world market leader. The intention is to achieve this goal through a concerted strategy involving the state and the economic sector – something difficult to imagine in the West – which would mean providing private and state-operated research institutions with optimum financial, research and marketing conditions.

    How can this goal be achieved? China primarily invests in evidence-based research approaches, scientific publications and strategic patents. While European patent registrations in the quantum technology sector are lagging behind, figures from the US and China are high: a working paper published in 2019 demonstrated that the USA and China hold half of all quantum technology patents registered. This success can be attributed to funding: in the USA it’s primarily the tech mega-corporations that spend money on an aggressive patent policy, whereas in China the state is the main funding provider and these funds mostly benefit research at state institutions and universities. Huge sums are spent on commercialisation of quantum technologies in the field of quantum communication, as well as the development of Quantum Key Distribution (QKD, a quantum cryptography process) and cold-atom interferometry (used in applications such as quantum sensor technology and metrology). Another particular focus of Chinese activities is the military practicability of quantum mechanics processes, which are being closely linked with civil research. Although there is no transparency with regard to the precise numbers, rumours are circulating of an eleven-figure sum in euros, in addition to the ongoing government funding.

    A research centre is being built in Hefei, capital of the Chinese Anhui Province, at a cost of 10 billion euros, which will be a national laboratory for quantum communication technologies. The city of Jinan in Eastern China also wants to build a quantum valley, with the aim of starting up projects worth billions by 2025.

    But it isn’t just state funding, a lot of money is also being channelled into quantum technology research by the Chinese online giants. For instance Alibaba has announced that the company will be investing a proportion of its planned research and development budget, a sum of around 13 billion euros, in the development of quantum computing.

    Quantum Key Distribution, quantum cryptography and the QNet

    Quantum Key Distribution (QKD) is the best-known process in quantum cryptography. The application of quantum-based cryptography makes it possible to transmit unhackable messages. At the moment the roll-out of terrestrial QKD networks in China is the most advanced in the world. China already operates a quantum cable 2000 kilometres long between the cities of Shanghai, Hefei, Jinan and Beijing. As quantum states have a maximum transmission length of around 100 kilometres through fibre-optic cables, messages have to be decrypted and re-encrypted at 32 trusted nodes and relayed to the next point. It was the discovery of quantum repeaters that made the quantum net (QNet) possible in the first place. The attractiveness of intrinsically secure quantum encryption makes its potential interesting not only to the military and governments, but also for a number of commercial applications. Virtual doctor’s appointments and even secret project meetings are already taking place on the QNet.
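    The sifting step at the heart of BB84, the canonical QKD protocol, can be simulated in a few lines. This is an idealised, noiseless sketch with no eavesdropper; real QKD adds error estimation, error correction and privacy amplification on top:

```python
import random

def bb84_sift(n_bits, seed=7):
    """Toy BB84 key sifting over a noiseless channel.

    Alice encodes random bits in randomly chosen bases; Bob measures
    in his own random bases. They publicly compare bases (not bits)
    and keep only the positions where the bases matched -- on average
    half of them -- which form the shared secret key.
    """
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]
    bob_bases = [rng.randint(0, 1) for _ in range(n_bits)]
    # With matching bases Bob reads Alice's bit exactly; with
    # mismatched bases his outcome is random and gets discarded.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    key_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)
                 if ab == bb]
    key_bob = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases)
               if ab == bb]
    return key_alice, key_bob

ka, kb = bb84_sift(1000)
# On a noiseless channel the sifted keys agree exactly.
```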

    The QNet would be able to provide three applications that have not existed so far on the conventional internet: unhackable communication, secure quantum computing in the Cloud and traceless searching on the net. In the next few years China and the USA plan to develop large networks for quantum cryptography, which could become the start of a general QNet. Such ambitious infrastructure projects already include the planning of quantum-ready terrestrial fibre-optics, submarine cables – and in particular communication satellites. 

    Quantum satellites and quantum computers

    In August 2016, China launched the quantum-based satellite “Micius”. The satellite, which was named after Mozi 墨子, a philosopher from the Warring States period (480–221 BC), was the starting point for the first successful transmission of a quantum key and the encrypted communication based on it. The project leader, Chinese physicist Pan Jianwei 潘建伟, conducted a quantum-encrypted video chat with his former doctoral supervisor, Anton Zeilinger, who was in Vienna at the time. The distance between the two quantum physicists was around 8,000 kilometres. So they were successful in carrying out a key exchange via satellite between China and Austria, thereby creating a secure communication channel. The satellite acted as a trusted node here. The video conference encrypted in this way using QKD was then held over a standard internet connection. China plans to establish a blanket QNet by 2030.

    Pan Jianwei is the leading quantum scientist in the country. He is referred to as “China’s Einstein” and ranks amongst the Chinese scientists who have been educated overseas. Pan completed his PhD in Vienna under Anton Zeilinger, one of the most highly reputed quantum scientists in the world. Pan founded a research group in Heidelberg and then returned to China, where he is now known as the “father of quantum”. He is so highly regarded that his laboratory at the University of Science and Technology of China (USTC) in the city of Hefei is visited from time to time by President Xi Jinping. Pan’s goal is to establish a long-distance, high-speed quantum communication system that will be compatible with classic communication technology and is up to ten billion times faster than Sycamore, Google’s quantum computer built in 2019.

    In May 2021 Pan Jianwei’s research team developed a programmable superconducting quantum processor with 62 qubits and called it Zu Chongzhi 祖沖之 after a well-known 5th century Chinese mathematician and astronomer. The system’s core objective is to increase the number of integrated qubits and improve the performance of superconducting qubits in step, so as to achieve exponential acceleration in the processing speed of specific problems, and finally to apply it in practice.

    Research on quantum computers is proceeding full steam ahead: a few months after the initial Zu Chongzhi version, a follow-up with 66 qubits was launched by Pan Jianwei’s team in cooperation with the Shanghai Institute of Technical Physics (Chinese Academy of Sciences). The additional four qubits can achieve an improvement to the processing power in terms of both quality and quantity. This means that the Zu Chongzhi 2.0 is ten million times faster than the fastest regular supercomputer and a million times more powerful than the superconducting quantum computer Sycamore made by Google. But another Chinese quantum computer, Nine Chapter 2, deserves a mention here as well: launched at the end of 2021, it is a product of the research team from Hefei, Shanghai and Wuxi. Its processing speed is tailored to handle specific problems and is one hundred quadrillion (10^17) times faster than a regular supercomputer. So after the USA this makes China the second country in the world to achieve quantum primacy.

    Quantum primacy: a foundation for the future

    Since quantum technology is a basis technology, the speed of the future technical revolution is very heavily dependent on competence in this area. The European industry is already painfully aware of this in the context of the current dependence on imported hi-tech from Asia, for example computer chips, pharmaceutical chemicals and pre-products used in the manufacture of other hi-tech goods (for example batteries).  The technological superiority over the conventional production processes used up to now is also critical for the future political sovereignty of Europe. Although this is now a subject of discussion in politics and the media, the acquisition of the necessary technical skills has still not progressed very far. It’s also questionable whether the European consumers are prepared to pay the unavoidably higher prices – for example for mobile devices, network technology, automotive and entertainment electronics. It’s certainly true to say that Europe has missed the boat with the digitalisation trend.

    The global mega-corporations nowadays consist exclusively of technology and internet-based business models. They left the “old industry” giants like big oil, big steel and aerospace behind long ago. The stock market values of these tech heavyweights are now higher than the gross domestic product of the major industrial countries. The statistics alone show the huge financial power of this new quantum technology, which until now has mainly been developed by internet giants. Thanks to a clever policy of promoting the home-grown mega-corporations, China can easily keep up with the big boys here. Europe on the other hand is outclassed. Even though China’s Communist Party is now taking extreme action against some corporations like Alibaba and Didi, the People’s Republic still hasn’t worked out a concept to stop monopolies forming in the internet industry. Europe’s dependency is on the USA in terms of the e-commerce economy – but on China for the production of high-end technological goods that define our present and our future. If Europe hopes to be a part of the immense value creation scheme involving these products and services for consumers and industry, there is no alternative but to step up efforts in the European quantum technology sector. The only way for Europe to survive as an influencer of technology perspectives in this key field is through prudent liberation from China’s domination.

    The development and possible applications of quantum technology are still in the early stages. The reason its potential seems so magical is because in this science of elementary particles the boundaries of time and space become blurred. The opportunities for disruptive innovations that are opening up in view of the rapidly occurring changes in this field lie outside the confines of empirical forecasts. Due to the fundamental nature of quantum mechanical effects, all areas of life will be affected by radical innovation – similar to the way things happened after the introduction of digitalisation and the internet. Complicated processes in particular are then subject to algorithmic processing. These include medical histories, preparation of legal submissions, design solutions in technology and architecture, control and direction functions in logistics and public transport, and finally planning, calculation and implementation of political and military strategies. No cultural traditions, social categories or professional groups will escape the potential disruptions.

    But to enable the ambivalent magic surrounding this new beginning, huge efforts will be required in terms of technology and science. Even the possibility of fully unhackable quantum cryptography and a quantum net based on this assumes technological capabilities that only a few nations and societies possess: a cutting-edge science and research community, a high-end electronics industry and engineering developed to an equally advanced level. The step from lab-based research to the robust practical application of quantum entanglement can only be performed with immense financial commitment and transfer of knowledge. Compared with the development of standard computers, we’re currently at a level approximately equivalent to 1975. Development of a fully error-corrected and universally usable quantum computer remains a great challenge in the immediate future.

    The global quantum technology race is getting harder each day and the uncertainty about the future of society is becoming increasingly important. In this context many questions remain open: what is this enormous computing power used for? Will the speed of research cause our lives to look totally different in the near future? Where will development take us in the field of quantum computing? Will the divide between the leading nations in this area and the rest of the world widen even more?
    So the race for the Great Leap into the age of quantum computing is still wide open; however, one thing’s certain – the new technologies have the potential to change the future of the world we live in massively.

    Further Reading

    Kagermann, H./Süssenguth, F./Körner, J./Liepold, A. (2020): Innovationspotenziale der Quantentechnologien der zweiten Generation / The Innovation Potential of Second-generation Quantum Technologies; (acatech IMPULS), Munich. 
    Mainzer, Klaus (2020): Quantencomputer. Von der Quantenwelt zur Künstlichen Intelligenz (Quantum computers. From the quantum world to artificial intelligence); Springer Nature, Berlin.
    Meier Christian J. (2021): Eine kurze Geschichte vom Quantencomputer (A short history of quantum computers); (TELEPOLIS), Heidelberg 2021.
    Patel N.V. (2020): China: Überholmanöver bei der Quantenkryptographie (China: an overtaking manoeuvre in quantum cryptography), in: Technology Review. The magazine for innovation. 
    Zhang Q./Xu F./Li L./Liu N.L. (2019): Quantum Information Research in China, in: Quantum Science and Technology 4, 40503.
