000-276 braindumps that are valid as of today and never go wrong.

Killexams.com gives you legitimate, up-to-date, 2022-refreshed IBM Business Process Manager Express or Standard Edition V8.0 BPM Application Development real questions with a 100 percent guarantee. However, 24 hours of practice with the VCE test simulator is required. Simply download the 000-276 real questions and actual questions from your download section and begin practicing. It will take just 24 hours to prepare you for the genuine 000-276 test.

Exam Code: 000-276 Practice exam 2022 by Killexams.com team
IBM Business Process Manager Express or Standard Edition V8.0 BPM Application Development
IBM Application answers
Killexams : I'm BCG's head of talent. Here are 3 qualities all top candidates have — and the one tricky question I like to ask.
  • Amber Grewal is a managing director, partner, and head of global talent at Boston Consulting Group.
  • When considering candidates, she looks for authenticity, an ability to adapt, and strong values.
  • She says interviewing is not about a right or wrong answer — it's about creativity and innovation.

This as-told-to essay is based on a conversation with Amber Grewal, a managing director, partner, and the head of global talent at Boston Consulting Group in San Francisco, California. It has been edited for length and clarity.

My career focus has always been on people — teaching skills, developing new operating models, and building up global hiring-related priorities like recruiting and training. Before BCG, I was the chief talent officer at Intel, supporting their talent strategy and management, and before that, I was the corporate VP of global talent acquisition at IBM. Over the last 20 years, I've held several senior leadership positions at GE and Microsoft, among other companies.

I joined BCG as a lateral hire from my CTO role at Intel. Transitioning to BCG offered me the opportunity to transform our global recruiting organization in a first-of-its-kind role. It's a great honor for me to lead a function that's dedicated to attracting BCG's greatest asset: our people.

As we grow, we're constantly looking for the best and brightest talent across industries, skill sets, and experience levels. For us, it comes down to a combination of attitude and skills.

We receive more than one million applications a year, and only 1% of them make the cut. The competition is fierce, and we have high expectations of candidates. Here's what it takes to get a job at BCG.

The top 3 things we look for in a job candidate are: authenticity and honesty, the ability to learn and adapt, and strong values

The best way for someone to demonstrate how they'd succeed at BCG is by sharing concrete examples of the impact they've made in their previous roles or projects. BCGers are inherently curious people, tenacious problem solvers comfortable with ambiguity, open to new and different points of view, and constantly on the quest to learn.

The easiest and most effective way to show curiosity is to ask questions. Good questions can leave a lasting impression on the interviewer. A good question is one backed by your research on the company, one about the work you'd be expected to do, or one about potential growth opportunities, which demonstrates your ambition.

For consulting roles, we check problem-solving skills through our casework round, but it's important to showcase your aptitude by sharing concrete examples from past work experience or even from your personal life, clearly outlining your approach and the outcome.

Quantifying accomplishments and experience to demonstrate measurable results is the key to standing out in your application and interview

Making your answers to interview questions, your resume, and your cover letter more results-oriented — highlighting outcomes of the projects you've worked on and not just listing tasks — is one way to go about it.

For example, if you helped a team grow their business in your previous role, it's important to quantify the impact by saying that you "supported X% revenue growth."

The most important question we want candidates to reflect on is: 'Why BCG, and why this role?'

What inspires a candidate to join BCG in a particular role — and what they expect out of it — is important information that we as interviewers want to know. It's especially important for us to understand that the applicant is not applying to just any open role but has a focus and a fair idea of the direction they want to take in the future.

For example, it can be how they see themselves specifically contributing their experience or expertise to a particular role, or what project or aspect of BCG's purpose they connect to and why.

When interviewing candidates, it's not about a right or wrong answer. Instead, I like to look at someone's creativity and innovation.

For example, I may ask, "How would you go about making a built-in coffee maker in a car?" When I ask questions like this, which may seem random and throw the candidate off, I'm paying attention to the approach the interviewee will take. It helps me gauge whether the interviewee has a structured thought process, how they behave in an unexpected situation, and whether they can look at a problem or situation through a 360-degree lens.

If I could give one piece of advice to someone interviewing at BCG, it would be to be true to yourself — that's your biggest competitive advantage.

Tue, 11 Oct 2022 | https://www.businessinsider.com/head-talent-boston-consulting-group-heres-what-look-for-2022-10
Killexams : IBM swallows Red Hat storage products

IBM is deepening its assimilation of Red Hat by adopting two of its open source storage products and moving some Red Hat staff to IBM.

Red Hat is transferring its storage portfolio and associated teams – including Red Hat Ceph Storage, Red Hat OpenShift Data Foundation (ODF), Rook, and NooBaa – to IBM. ODF will become the foundation for Spectrum Fusion. Customers will see no difference when buying the software.

IBM storage GM Denis Kennelly said: “By bringing together the teams and integrating our products under one roof, we are accelerating IBM’s hybrid cloud storage strategy while maintaining commitments to Red Hat customers and the open-source community.

“I believe this creates the most complete and powerful software-defined storage portfolio in the industry.”

Red Hat’s VP of hybrid platforms, Joe Fernandes, added: “With IBM Storage taking stewardship of Ceph, and OpenShift Data Foundation, IBM will help accelerate open-source storage innovation for Red Hat and IBM customers and expand the market opportunity beyond what each of us could deliver on our own.”

Red Hat’s ODF is a cloud-native storage, data management, and data protection combination based on the Ceph, NooBaa, and Rook software components. Ceph provides object, block, and file storage. NooBaa, acquired by Red Hat in 2018, abstracts storage infrastructure across hybrid multi-cloud environments and provides data storage service management. Rook orchestrates multiple storage services, each with a Kubernetes operator, and can be used to set up a Ceph cluster.
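
Because Ceph exposes an S3-compatible API through its RADOS Gateway, ordinary S3 tooling can exercise the object tier described above. Below is a minimal sketch using Python's boto3; the endpoint URL and credentials are placeholders for a hypothetical deployment, not anything from IBM or Red Hat documentation.

```python
# Talk to a Ceph cluster's object storage through its S3-compatible
# RADOS Gateway. Endpoint and keys below are assumed placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://ceph-rgw.example.internal:8080",  # assumed RGW endpoint
    aws_access_key_id="CEPH_ACCESS_KEY",
    aws_secret_access_key="CEPH_SECRET_KEY",
)

s3.create_bucket(Bucket="backups")
s3.put_object(Bucket="backups", Key="db/2022-10-04.dump", Body=b"example bytes")

# List what landed in the bucket.
for obj in s3.list_objects_v2(Bucket="backups").get("Contents", []):
    print(obj["Key"], obj["Size"])
```

The same pattern works against any S3-compatible endpoint, which is one reason Ceph's object tier slots into hybrid cloud designs so readily.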

IBM’s storage software product line has an overall Spectrum brand. Examples are:

  • Spectrum Elastic Storage System (ESS) – software-defined storage for AI and big data
  • Spectrum Discover – file cataloging and indexing product
  • Spectrum Fusion – containerized derivative of Spectrum Scale plus Spectrum Protect data protection
  • Spectrum Protect – data protection
  • Spectrum Scale – scale-out, parallel file system software
  • Spectrum Virtualize – operating, management, and virtualization software for the Storwize and FlashSystem arrays and SAN Volume Controller
  • Spectrum Virtualize for Public Cloud (SVPC) – available for the IBM public cloud, AWS, and Azure

Ceph is not becoming Spectrum Ceph. Along with ODF, it will remain 100 percent open source. IBM said it will assume Premier Sponsorship of the Ceph Foundation, whose members collaborate on the Ceph open source project.

Overall, IBM says its clients will have access to a consistent set of hardware-agnostic storage services with data resilience, security, and governance across bare metal, virtualized, and containerized environments. 

They should get a unified storage experience for containerized apps running on Red Hat OpenShift. They can use Spectrum Fusion (now with Red Hat OpenShift Data Foundation) to get performance, scale, automation, data protection, and data security for production applications running on OpenShift that require block, file, and/or object access to data. 

IBM says customers should be able to build in the cloud and then deploy on-premises with automation. They will be able to move developed applications from the cloud to on-premises services; automate the creation of staging environments to test deployment procedures; validate configuration changes, database schema and data updates; and prepare package updates.

Kennelly said: “From edge-to-core-to-cloud, software-defined storage is the answer you’ve been looking for, and it’s now here from IBM.” 

This move of software and staff (associates in IBM-speak), characterized by IBM as a partial and non-cash acquisition, should complete by January 1. From then on Red Hat OpenShift Platform Plus will continue to include OpenShift Data Foundation, and be sold by Red Hat and its partners. New Red Hat OpenStack customers will still be able to buy Red Hat Ceph Storage from Red Hat and its partners. Existing OpenShift and OpenStack subscription customers will see no change in their Red Hat relationship. 

IBM said forthcoming Ceph and Spectrum Fusion storage offerings based on Ceph are expected to ship from the first half of 2023. 

Tue, 04 Oct 2022 | Chris Mellor | https://blocksandfiles.com/2022/10/04/ibm-red-hat-storage/
Killexams : International Business Machines Corp

52 week range

114.56 - 144.73

KEY STATS

  • Open: 121.80
  • Day High: 122.88
  • Day Low: 121.43
  • Prev Close: 120.04
  • 52 Week High: 144.73
  • 52 Week High Date: 06/06/22
  • 52 Week Low: 114.56
  • 52 Week Low Date: 11/26/21
  • Market Cap: 109.754B
  • Shares Out: 903.18M
  • 10 Day Average Volume: 0.00M
  • Dividend: 6.58
  • Dividend Yield: 5.41%
  • Beta: 0.83
  • YTD % Change: -9.08

RATIOS/PROFITABILITY

  • EPS (TTM): 6.15
  • P/E (TTM): 19.76
  • Fwd P/E (NTM): 12.46
  • EBITDA (TTM): 11.935B
  • ROE (TTM): 27.73%
  • Revenue (TTM): 59.677B
  • Gross Margin (TTM): 54.01%
  • Net Margin (TTM): 9.61%
  • Debt To Equity (MRQ): 259.21%

EVENTS

  • Earnings Date: 10/19/2022
  • Ex Div Date: 08/09/2022
  • Div Amount: 1.65
  • Split Date: -
  • Split Factor: -

Thu, 13 Oct 2022 | https://www.cnbc.com/quotes/IBM
Killexams : As NIST Prepares For Quantum Safe Security, IBM Rolls Out Support

The world of cryptography moves at a slow but steady pace. New cryptography standards must be vetted over an extended period, so new threats to existing standards need to be judged on decades-long timelines; updating crypto standards is a multiyear journey. Quantum computing is an important threat looming on the horizon. Quantum computers can explore many computational paths simultaneously, and based on Shor's algorithm, crypto experts estimate that they will be able to crack asymmetric encryption. In addition, Grover's algorithm provides a quadratic reduction in the time needed to brute-force symmetric encryption, in effect halving the effective key length (a 128-bit key offers only about 64 bits of security against a quantum attacker). The question these same crypto experts try to answer is not if this will happen, but when.

Today’s crypto algorithms use mathematical problems such as factorization of large numbers to protect data. With fault-tolerant quantum computers, factorization could in theory be solved in just a few hours using Shor’s algorithm. The same capability also compromises cryptographic methods based on the difficulty of solving the discrete logarithm problem.

The term used to describe these new, sturdier crypto standards is “quantum safe.” The challenge is that we don’t know exactly when fault-tolerant quantum computers will be powerful enough to consistently break the encryption standards now in wide use. There’s also a concern that some parties could harvest and store encrypted data today for decryption later, once suitably capable quantum computers are available. Even if the data is over ten years old, it could still contain relevant confidential information. Think state secrets, financial and securities records and transactions, health records, or even private or classified communications between public and/or government figures.

The U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) believes it’s possible that RSA-2048 encryption could be cracked by 2035. Other U.S. government agencies and other security-minded entities have similar timelines. Rather than wait until the last minute to upgrade security, NIST started a competition to develop quantum-safe encryption back in 2016. After several rounds of review, on July 5th of this year, NIST chose four algorithms for the final stages of review before setting the standard. IBM developed three of them, and two of those are supported in IBM’s Z16 mainframe today.

NIST: "NIST Announces First Four Quantum-Resistant Cryptographic Algorithms"

The new IBM crypto algorithms are based on a family of math problems called structured lattices, which have characteristics that make them reasonably difficult for quantum computers to solve. Structured lattice problems require solving for two unknowns, a multiplier array and an offset, and the shortest vector problem (SVP) and closest vector problem (CVP) upon which lattice cryptography is built are considered extremely difficult even for a quantum computer. Each candidate crypto algorithm is evaluated not just for data security but also for performance: the overhead cannot be too large for widespread use.
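
To make the "multiplier array and an offset" description concrete, here is a toy Learning With Errors (LWE) instance in Python, the problem family behind structured-lattice schemes such as CRYSTALS-Kyber. This is an illustration only: the dimension is far too small to be secure, and nothing here reflects IBM's actual implementation.

```python
# Toy LWE instance: recovering s from (A, b) where b = A*s + e mod q.
# Without the small noise e this is plain linear algebra; with it,
# the task becomes a hard lattice problem.
import numpy as np

rng = np.random.default_rng(0)
q = 3329   # modulus (the value Kyber uses)
n = 8      # toy dimension; real schemes use hundreds

A = rng.integers(0, q, size=(n, n))   # public multiplier array
s = rng.integers(0, 2, size=n)        # secret vector
e = rng.integers(-2, 3, size=n)       # small random offset ("noise")

b = (A @ s + e) % q                   # published alongside A

print("public A shape:", A.shape)
print("public b:", b)
```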

The final selections are expected in 2024, but there’s still a chance there will be changes before the final standards are released.


IBM Supports Quantum Safe in New Z-Series Mainframes

IBM made a strategic bet before the final NIST selections. The recently released IBM Z16 series computers already support two of the final four quantum-safe crypto candidates: the CRYSTALS-Kyber public-key encryption algorithm and the CRYSTALS-Dilithium digital signature algorithm. IBM is set to work with the industry to validate these algorithms in production systems. Initially, IBM is using its tape drive storage systems as a test platform. Because tape is often used for cold storage, it's an excellent medium for long-term data protection. IBM is working with its client base to find the appropriate way to roll out quantum-safe encryption to the market. This must be approached as a life-cycle transformation. In fact, IBM is working with its customers to create a crypto-agile solution, which allows the underlying crypto algorithm to be changed at any point in time without disrupting the entire system; it's not a rip-and-replace process. With crypto-agility, the algorithm is abstracted from the system software stack, so new algorithms can be deployed seamlessly. IBM is also developing tools that make crypto status part of overall observability, with a dashboard for viewing crypto events and the like.
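
To illustrate the crypto-agility idea in code, the sketch below abstracts signing behind an interface so the algorithm can be swapped without touching application code. This is a minimal sketch under assumptions of my own, not IBM's design; the classical signer uses the Python cryptography library, and a lattice-based signer such as Dilithium could be registered in its place later.

```python
# Crypto-agility sketch: callers depend on the Signer interface, not on
# any particular algorithm, so algorithms can be swapped via the registry.
from abc import ABC, abstractmethod
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

class Signer(ABC):
    @abstractmethod
    def sign(self, message: bytes) -> bytes: ...

class EcdsaSigner(Signer):
    """A classical algorithm plugged in behind the abstraction."""
    def __init__(self) -> None:
        self._key = ec.generate_private_key(ec.SECP256R1())

    def sign(self, message: bytes) -> bytes:
        return self._key.sign(message, ec.ECDSA(hashes.SHA256()))

# A quantum-safe signer would be registered here later, with no change
# to calling code; that swap-in-place property is the point of agility.
REGISTRY: dict[str, type[Signer]] = {"ecdsa-p256": EcdsaSigner}

def get_signer(name: str) -> Signer:
    return REGISTRY[name]()

signature = get_signer("ecdsa-p256").sign(b"audit record")
print(len(signature), "byte signature")
```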

These new algorithms must be deployable to existing computing platforms, even at the edge. However, it's not going to be feasible to upgrade every system; the transition will probably be an industry-by-industry effort, and industry consortia will be required. For example, IBM, GSMA (Global System for Mobile Communication Association) and Vodafone recently announced they will work via a GSMA Task Force to identify a process to implement quantum-safe technologies across critical telecommunications infrastructure, including the networks underpinning internet access and public utility management. The telecommunication network carries financial data, health information, public-sector infrastructure systems, and sensitive business data, all of which need to be protected as they traverse global networks.

IBM Research Blog: "How IBM is helping make the world's networks quantum safe"

What’s Next for Quantum Safe Algorithms

Fault-tolerant quantum computing is coming. When it will be available is still a guessing game, but the people who care most about data security are targeting 2035 to have quantum-safe cryptographic algorithms in place to meet the threat. But that's not good enough. We need to start protecting critical data and infrastructure sooner than that, considering how long systems stay deployed in the field and how long data is stored. Systems such as satellites and power stations are not easy to update in the field.

And there’s data that must be stored securely for future retrieval, including HIPAA-protected medical records, tax records, Toxic Substances Control Act data, clinical trial data, and more.

Even after the deployment of these new algorithms, this is not the end: there may still be developments that can break even the next generation of quantum-safe algorithms. The struggle between those who want to keep systems and data safe and those who want to crack them continues, which is why companies should build crypto agility into their security plans.

Tirias Research tracks and consults for companies throughout the electronics ecosystem from semiconductors to systems and sensors to the cloud. Members of the Tirias Research team have consulted for IBM and other companies throughout the Security, AI and Quantum ecosystems.

Fri, 07 Oct 2022 | Kevin Krewell | https://www.forbes.com/sites/tiriasresearch/2022/10/07/as-nist-prepares-for-quantum-safe-security-ibm-rolls-out-support/
Killexams : How To Turn An AI Idea Into A Real Product

Oleg Lola, founder and CEO at MobiDev, a custom software development company.

Four years ago, Gartner predicted that by 2022, 85% of AI projects would fail to deliver tangible outcomes. Yet according to the IBM Global AI Adoption Index, around 66% of tech companies today either already apply AI or plan to. This means the market is still growing, and there is no other way to stay competitive but to adopt artificial intelligence.

The prosperity of AI products, in turn, makes it easier for new applications to grow by producing reusable technical resources. However, publicly available assets don't make adoption easy: as Gartner reports, only 53% of AI projects make it from prototype to production.

Technical complexity might seem an obvious reason for an initiative to fail, but that's not always the case. In my observation, the majority of AI product ideas die before any development has even started. This has to do with the approach entrepreneurs try to apply and the misconceptions businesses have about technology in general.

Where should an AI project start?

Any application has front-end and back-end components, with some business logic laid underneath. That's basically what dictates the terms when we plan a roadmap for future development by setting milestones, goals and feature lists.

Obviously, AI applications don’t exist in a vacuum without an interface or backend infrastructure to make the whole thing run. Whether you’re about to glue some functionality to the existing app or build a whole new product, you’ll definitely need to develop working software around AI. But does it mean we can treat AI projects like traditional ones? The short answer: Definitely not, and the reason why is that there is much more uncertainty.

The idea of an AI product is formed around the data and how it can be used to achieve certain results. But even once the data is understood and a potential model identified, it is still impossible to predict whether the two will actually work together. When it comes to machine learning, the same model may suit one task but completely fail at another, similar task. This makes early-stage planning useless both for the AI part and for the surrounding software, as we can't be sure the idea is viable. So where should we start?

Embrace proof-of-concept.

The worst-case scenario is ignoring the uncertainty factor and building your plan on top of it. Over the years, we've developed a solid conviction that a proof-of-concept, or POC, approach is the only way to dispel doubts around customers' AI project ideas. A POC is a collaborative experiment where we try to turn our assumptions into something working.

Ultimately, starting with a POC helps to set boundaries and understand the project's short-term requirements. Concerning the budget, a POC is also beneficial because it clarifies the initial scope of work, which allows us to estimate the time and resources needed. From there, your chances of advancing to a pilot increase considerably.

However, there is one potential pitfall to know about. Any AI project is still a scientific project that requires tons of experiments to make things work. As we strive to build an application that will work in production, we need to search for proof of concept in conditions as close as possible to real life. This means you should avoid running a POC on unrepresentative data in a laboratory environment.

Understand the innovativeness of your project.

The next thing to pay attention to is the innovativeness of your product idea. Among our customers, there is a clear trend of holding wrong expectations about the development time, budget and resources a project requires. In an ideal world, a well-known technology should be easy to adopt. But in reality, every AI application requires some time to investigate the details.

During the last 10 years, the world has seen thousands of AI applications, with over 8,000 active startups in 2022. Thanks to the tech community, many of these resources remain available and open to the public. Namely, there are numerous open-source models designed for different AI tasks, and datasets collected for specific types of AI research.

Depending on the innovativeness of your project, all of those open source assets may or may not be used to some degree. The relationship here is pretty linear.

• The newer the idea, the fewer data and pretrained models are available out of the box. This means additional effort for data collection, understanding, labeling and real model training.

• The more straightforward your idea is, the richer the open-source assets available for any stage of your project. For example, tasks like object detection in computer vision require far less preparation because there are a plethora of pretrained models and tons of data available, as the sketch below illustrates.
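
As a concrete, hedged illustration of that second point, the sketch below loads a COCO-pretrained object detector shipped with torchvision and runs it on a dummy image; for a task this well-trodden, a proof-of-concept needs almost no data collection or training. The example is generic and not tied to any particular project.

```python
# Object detection with a pretrained torchvision model: no custom
# data or training required to get a first POC result.
import torch
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT  # COCO-pretrained
model = fasterrcnn_resnet50_fpn(weights=weights).eval()

image = torch.rand(3, 480, 640)  # stand-in for a real POC image
with torch.no_grad():
    prediction = model([image])[0]  # dict with boxes, labels, scores

categories = weights.meta["categories"]
for label, score in zip(prediction["labels"], prediction["scores"]):
    if score > 0.5:
        print(categories[int(label)], float(score))
```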

The earlier you understand the general terms, the sooner you can set the right expectations on the project’s duration and complexity.

Don’t chase early estimates.

The absence of clear estimates is a huge pain point for any entrepreneur. But in the case of AI projects, it is fairly difficult to provide numbers and deadlines early on, especially before any investigation of the idea has been done. That's where POC iterations come in handy.

Any estimates should be formed based on the technical feedback produced during the elaboration phase. For a business person, the best approach is to trust the technical feedback of your team and follow along. Data science experts and machine learning engineers are much more aware of the risks and can form rough order-of-magnitude estimates, and firmer ones later, if you allow them to figure out the project requirements in a real experiment.

More broadly, you also need to treat your AI project as part of your business strategy to ensure proper resource allocation and gradual investment in development.


Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives.


Wed, 05 Oct 2022 | Oleg Lola | https://www.forbes.com/sites/forbestechcouncil/2022/10/05/how-to-turn-an-ai-idea-into-a-real-product/
Killexams : Universities Make North Carolina A Hub for Quantum Computing

(TNS) — In the 1950s, computers were bulky, inefficient and limited. They ate up entire rooms but couldn’t go beyond rudimentary calculations.

As you know, these machines didn’t stay simple; the mid-20th century computer modernized, compacted, and went on to change the world. This is the path many believe quantum computers are now on: elementary today, transformative tomorrow. Well, we’ll see.

The promise of computers based on subatomic physics is tantalizing. In theory, problems that would take classical computers years to solve could be handled by quantum computers in minutes, opening up advancements in finance, chemistry, artificial intelligence, logistics, cybersecurity and more.


With exponentially enhanced calculating power, scientists may have the tools to discover new medicines. Financial firms could better optimize portfolios. Companies would route supply chains more efficiently, and meteorologists would grow more accurate. Hackers might use quantum’s power to bypass passwords, but companies and nations could counter by deploying quantum computing to strengthen their cyber defenses.

The possibilities of what quantum could accomplish are vast and hard to pinpoint. Researchers don’t know when a real-world quantum breakthrough will occur, but many do say “when,” not “if.”

“Quantum is progressing faster than many people are anticipating,” said Eric Ghysels, a finance and economics professor at the University of North Carolina-Chapel Hill. “This thing is coming, and you better be prepared.”


Governments, businesses and universities worldwide are spending heavily to prepare for quantum. And in the past few years, the three corners of North Carolina's Research Triangle — Duke University, North Carolina State University, and UNC — have each made distinct contributions to this emerging field, turning the state into a legitimate quantum hot spot.

THE POWER AND THE NOISE

In 2018, IBM picked N.C. State’s Centennial Campus as the site of its first IBM Quantum Hub in North America.

Two years later, Duke partnered with the Maryland-based company IonQ to open the Duke Quantum Center inside downtown Durham’s Chesterfield building. Under their arrangement, IonQ has exclusive rights to the intellectual property the lab produces while Duke has received equity in the public company.

IBM and IonQ — and by extension N.C. State and Duke — are racing toward a common goal: to achieve what’s known as “quantum advantage,” the still-elusive moment when a quantum computer can perform a real-world task better than a classical computer. (The term “quantum supremacy” refers to a moment when a quantum computer achieves something a classical computer could never accomplish.)

But chasing quantum advantage is where the similarities between the two facilities end, said Chris Monroe, cofounder of IonQ and the director of the Duke Quantum Center. “IBM’s approach and our approach couldn’t be more different,” he said.

To understand their differences, it helps to understand some of quantum’s underlying science.

Quantum computers harness the physics of the subatomic world to manage information. While classical computers run on bits represented by digital 1s and 0s, quantum computers use quantum bits, called qubits, which represent microscopic states in a much more complex manner.

A pair of quantum mechanical phenomena make these machines exponentially more advanced. The first is called superposition — the capacity of a qubit to be in multiple states at once until it’s measured. The second is entanglement, which is how different qubits are interwoven.

All this can be quite confusing to the layman, and even to other scientists, quantum researchers acknowledge.

“The laws of microscopic physics look very, very different from what you and I experience on a normal day,” said Patrick Dreher, the chief scientist at N.C. State’s IBM Quantum Hub.

Instead of single answers, quantum computers spit out probability distributions. For example, they wouldn’t say 2 + 3 = 5 but would answer with a range of probabilities peaking around 5. This is one of the reasons researchers say quantum computers will augment, but never fully replace, digital Macs or PCs. Quantum machines could handle massive calculations, but quotidian tasks like Microsoft Word, basic mathematics and streaming videos may always be best served by classical computers.
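
For readers who want to see this behavior directly, here is a minimal sketch using Qiskit, IBM's open-source quantum SDK (assuming the qiskit and qiskit-aer packages are installed). It puts one qubit in superposition, entangles a second with it, and measures 1,000 times; the output is a distribution over outcomes rather than a single deterministic answer.

```python
# Two-qubit Bell state: measurement yields a probability distribution
# peaking at "00" and "11", not one fixed value.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2)
qc.h(0)        # superposition on qubit 0
qc.cx(0, 1)    # entangle qubit 1 with qubit 0
qc.measure_all()

counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)  # e.g. {'00': 507, '11': 493}
```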

So, what’s keeping quantum computers from reaching their potential? There are several hurdles.

As one might expect, the subatomic realm is difficult to control. Atoms naturally bounce around, which can cause contamination that leads to “noisy” results. The microscopic interactions computers must capture occur incredibly quickly, requiring extreme precision, and when errors arise, attempts to correct one qubit can easily interfere with other qubits.

“It’s a fragile machine,” Dreher said. “And because they are noisy, they have a limited ability to keep doing computations forever.”

Despite their limitations, quantum computers have evolved from theory to tangible, functioning machines. Researchers have named this current stage the Noisy Intermediate-Scale Quantum, or NISQ, era. There are several intermediate-scale, noisy computers running today, with Dreher saying no one knows which will be “the home run.”

LASERS, ATOMS AND GOLD CHANDELIERS

Of all the quantum computers in the world, IBM’s are likely the most ornate. Nicknamed “chandeliers,” they feature gold-plated, five-level apparatuses with an orderly progression of tubes and wires running down to single silicon processor chips. At the bottom rung, each chandelier cools a superconducting chip.

And by cool, IBM means really, really frigid.

Quantum researchers attempt to control subatomic activity by creating extreme environments, and the chandelier does this with temperature. At its lowest level, the temperature is 0.01 kelvin, making it one of the coldest places in the universe.

IBM operates more than 20 quantum computers around the world, from upstate New York to Japan, and they offer members of their quantum network — like N.C. State — exclusive access to advanced computers which are kept inside a metal silo and behind a glass cube like a museum art piece. As a member of the IBM Quantum Network, N.C. State scientists can run remote experiments on these computers through the cloud.

Duke students, on the other hand, have access to a quantum computer they can touch. The Duke Quantum Center studies a type of computer called an ion trap, which levitates individual atoms above a gold-plated silicon chip in an airless vacuum. Lasers are then shot at the atoms to modify the state of the qubits and affect how they interact. Chris Monroe compared the process to plucking a guitar string.

The handful of ion-trap computers at Chesterfield are aesthetically less impressive than IBM’s chandelier. They stretch out like a crowded city — vacuum chambers, camera lenses, modulators and lasers intricately huddled together. Monroe touts these machines as the most advanced ion-traps in the world and believes once the engineering obstacles are overcome (in short, it’s very difficult to precisely strike atoms with lasers), IonQ computers can be widely available.

“We envision a future where quantum computers are in people’s pockets,” he said.

UNC TACKLES FINTECH

While N.C. State and Duke focus on quantum research, UNC is a national leader on quantum technologies in finance.

In May 2020, the school started one of the country’s first webinar series on applying quantum to business. To meet the growing demand for quantum, the UNC Kenan-Flagler Business School has adjusted its curriculum to include more about quantum. “The stakes are high,” said Eric Ghysels. “A lot of financial institutions realize they better get started now, even if the science is a few years away.”

The financial industry is well positioned for early quantum applications. First, it’s an analytic-based industry where precise timing and price modeling can mean billions. Many foresee quantum optimizing portfolios and delivering unprecedented account security. Second, with so much money on the line, deep-pocketed companies are making significant investments in the field.

In March, Fidelity entered a partnership with N.C. State’s hub, and IBM itself is interested in the research its hub produces for finance purposes. “In terms of fintech, the Triangle is becoming a considerable force,” Ghysels said. “Companies want to hire here and settle here.”

The Triangle lacks the concentration of quantum companies seen in cities like San Francisco, Boston and New York, but there are signs the commercial side of quantum is burgeoning locally as larger companies like Apple and Google enter the market.

The California quantum computing manufacturer Atom Computing recently based its executive office in Cary, and startups like Dark Star Quantum Lab in Apex seek to find a niche in quantum consulting.

“It makes sense this would be a good place for quantum in terms of applications,” said Dark Star CEO Faisal Shah Khan.

Khan noted the Triangle’s relative proximity to the financial capital of New York City (same time zone, quick flight away) makes it an even more attractive place for quantum and fintech.

'IT WILL FAIL IN SOME INTERESTING WAY'

So when will quantum computers be ready?

“If I knew that, I wouldn’t be here as a professor,” Patrick Dreher said. “I’d be on Wall Street, or I’d be talking to (venture capitalists). You’re asking me in 1949, ‘When are we going to build a digital computer that won’t have vacuum tubes and need a whole room to make it work?’ This is why people win Nobel Prizes.”

In 2019, Google announced it had achieved quantum supremacy on a contrived mathematical problem, meaning one that doesn’t relate to a real-world situation. IBM pushed back on Google’s claim, and the quantum supremacy debate lingers.

Quantum technology remains in a “pre-competitive stage,” said Dennis Kekas, an associate vice chancellor at N.C. State’s Centennial Campus. By this, Kekas meant companies generally still share their findings in service of scientific advancement. In academia, UNC, Duke and N.C. State host a weekly Triangle Quantum Computing Seminar Series throughout the school year, inviting experts from around the globe.

At Chesterfield, Duke PhD students are on the front lines of quantum research. Their current lab work focuses on getting ion-trap computers to communicate with each other, sharing information between machines like classical computers can do now. In recent years, they have seen colleagues go on to jobs as quantum business consultants, continue in academia, join IonQ, or get brought into national labs like Los Alamos.

Asked about the future of quantum, the students’ perspectives were a reminder that no one knows for sure — not them, not their teachers, not the companies that might hire them — if the quantum dream will ever be realized. They spoke of keeping the long view in mind and noted quantum advantage isn’t likely to be right around the corner.

Jameson O’Reilly, a fourth-year PhD student at the lab, said he believes quantum advantage will eventually be achieved, but if it isn’t, he said it still will have been worth the effort.

“I think that if it doesn’t happen, it will fail in some interesting way,” he said. “In a way that gives us more understanding of the universe.”

©2022 The News & Observer. Distributed by Tribune Content Agency, LLC.

Thu, 06 Oct 2022 | https://www.govtech.com/education/higher-ed/universities-make-north-carolina-a-hub-for-quantum-computing
Killexams : IBM Redefines Hybrid Cloud Application and Data Storage Adding Red Hat Storage to IBM Offerings

Newly expanded software-defined storage portfolio enables IBM to deliver a consistent experience from edge-to-core-to-cloud

ARMONK, N.Y., Oct. 4, 2022 /PRNewswire/ -- IBM (NYSE: IBM) announced today it will add Red Hat storage products and Red Hat associate teams to the IBM Storage business unit, bringing consistent application and data storage across on-premises infrastructure and cloud.


With the move, IBM will integrate the storage technologies from Red Hat OpenShift Data Foundation (ODF) as the foundation for IBM Spectrum Fusion. This combines IBM and Red Hat's container storage technologies for data services and helps accelerate IBM's capabilities in the burgeoning Kubernetes platform market.

In addition, IBM intends to offer new Ceph solutions delivering a unified and software defined storage platform that bridges the architectural divide between the data center and cloud providers. This further advances IBM's leadership in the software defined storage and Kubernetes platform markets.

According to Gartner, by 2025, 60% of infrastructure and operations (I&O) leaders will implement at least one of the hybrid cloud storage architectures, which is a significant increase from 20% in 2022.[1] IBM's software defined storage strategy is to take a "born in the cloud, for the cloud" approach—unlocking bi-directional application and data mobility based on a shared, secure, and cloud-scale software defined storage foundation.

“Red Hat and IBM have been working closely for many years, and today’s announcement enhances our partnership and streamlines our portfolios,” said Denis Kennelly, general manager of IBM Storage, IBM Systems. “By bringing together the teams and integrating our products under one roof, we are accelerating IBM’s hybrid cloud storage strategy while maintaining commitments to Red Hat customers and the open-source community.”

"Red Hat and IBM have a shared belief in the mission of hybrid cloud-native storage and its potential to help customers transform their applications and data," said Joe Fernandes, vice president of hybrid platforms, Red Hat. "With IBM Storage taking stewardship of Red Hat Ceph Storage and OpenShift Data Foundation, IBM will help accelerate open-source storage innovation and expand the market opportunity beyond what each of us could deliver on our own. We believe this is a clear win for customers who can gain a more comprehensive platform with new hybrid cloud-native storage capabilities."

As customers formulate their hybrid cloud strategies, critical to success is the emphasis and importance of infrastructure consistency, application agility, IT management and flexible consumption consistency as deciding factors to bridge across on-premises and cloud deployments.

With these changes to the IBM portfolio, clients will have access to a consistent set of storage services while preserving data resilience, security, and governance across bare metal, virtualized and containerized environments.  Some of the many benefits of the software defined portfolio available from IBM will include:

  • A unified storage experience for all containerized apps running on Red Hat OpenShift: Customers can use IBM Spectrum Fusion (now with Red Hat OpenShift Data Foundation) to achieve the highest levels of performance, scale, automation, data protection, and data security for production applications running on OpenShift that require block, file, and/or object access to data. This enables development teams to focus on the apps, not the ops, with infrastructure-as-code designed for simplified, automated managing and provisioning.

  • A consistent hybrid cloud experience at enterprise levels of scale and resiliency with IBM Ceph: Customers can deliver their private and hybrid cloud architectures on IBM's unified and software defined storage solution, providing capacity and management features. Capabilities include data protection, disaster recovery, high availability, security, auto-scaling, self-healing and portability; they are not tied to hardware and travel with the data as it moves between on-premises and cloud environments.

  • A single data lakehouse to aggregate and derive intelligence from unstructured data on IBM Spectrum Scale: Customers can address the challenges that often come with quickly scaling a centralized data approach with a single platform to support data-intensive workloads such as AI/ML, high performance computing, and others. Benefits can include less time and effort to administer, reduced data movement and redundancy, direct access to data for analytics tools, advanced schema management and data governance, all supported by distributed file and object storage engineered to be cost effective.

  • Build in the cloud, deploy on-premises with automation: Customers can move developed applications from the cloud to on-premises services, automate the creation of staging environments to test deployment procedures, validate configuration changes, database schema and data updates, and prepare package updates to overcome obstacles in production or correct errors before they become a problem that affects business operations (a sketch of this kind of automation follows below).
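
As one hedged sketch of such automation, the snippet below uses the official Kubernetes Python client to provision a PersistentVolumeClaim against a Ceph RBD storage class of the sort ODF creates. The namespace, claim name and storage class name are assumptions for a hypothetical cluster, not details from this announcement.

```python
# Provision block storage for a staging environment as code.
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() inside a pod

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="staging-db-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="ocs-storagecluster-ceph-rbd",  # assumed ODF class
        resources=client.V1ResourceRequirements(requests={"storage": "10Gi"}),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="staging", body=pvc
)
```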

"IBM and Red Hat speaking with one voice on storage is delivering the synergies derived from IBM's Red Hat acquisition," said Ashish Nadkarni, group vice president and general manager, Infrastructure Systems at IDC. "The combining of the two storage teams is a win for IT organizations as it brings together the best that both offer: An industry-leading storage systems portfolio meets an industry-leading software-defined data services offering. This initiative enables IBM and Red Hat to streamline their family of offerings, passing the benefits to their customers. It also helps accelerate innovation in storage to solve the data challenges for hybrid cloud, all while maintaining their commitment to open source."

Preserving commitment to Red Hat clients and the community

Under the agreement between IBM and Red Hat, IBM will assume Premier Sponsorship of the Ceph Foundation, whose members collaborate to drive innovation, development, marketing, and community events for the Ceph open-source project. IBM Ceph and Red Hat OpenShift Data Foundation will remain 100% open source and will continue to follow an upstream-first model, reinforcing IBM's commitment to these vital communities. Participation by the Ceph leadership team and other aspects of the open-source project is a key IBM priority to maintain and nurture ongoing Red Hat innovation.

Red Hat and IBM intend to complete the transition by January 1, 2023, which will involve the transfer of storage roadmaps and Red Hat associates to the IBM Storage business unit. Following this date, Red Hat OpenShift Platform Plus will continue to include OpenShift Data Foundation, sold by Red Hat and its partners. Additionally, Red Hat OpenStack customers will still be able to buy Red Hat Ceph Storage from Red Hat and its partners. Red Hat OpenShift and Red Hat OpenStack customers with existing subscriptions will be able to maintain and grow their storage footprints as needed, with no change in their Red Hat relationship.

Forthcoming IBM Ceph and IBM Spectrum Fusion storage solutions based on Ceph are expected to ship beginning in the first half of 2023.

Read more about today's news in this blog from Denis Kennelly, general manager of IBM Storage, IBM Systems: "IBM + Red Hat: Doubling Down on Hybrid Cloud Storage"

Statements regarding IBM's future direction and intent are subject to change or withdrawal without notice and represent goals and objectives only. Red Hat, Ceph, Gluster and OpenShift are trademarks or registered trademarks of Red Hat, Inc. or its subsidiaries in the U.S. and other countries.

About IBM 
IBM is a leading global hybrid cloud, AI, and business services provider, helping clients in more than 175 countries capitalize on insights from their data, streamline business processes, reduce costs and gain the competitive edge in their industries. Nearly 3,800 government and corporate entities in critical infrastructure areas such as financial services, telecommunications and healthcare rely on IBM's hybrid cloud platform and Red Hat OpenShift to effect their digital transformations quickly, efficiently, and securely. IBM's breakthrough innovations in AI, quantum computing, industry-specific cloud solutions and business services deliver open and flexible options to our clients. All of this is backed by IBM's legendary commitment to trust, transparency, responsibility, inclusivity, and service. For more information, visit www.ibm.com.

Media Contacts: 
Ben Stricker, IBM 
ben.stricker@ibm.com

[1] Gartner, Market Guide for Hybrid Cloud Storage, Julia Palmer, Kevin Ji, Chandra Mukhyala, 3 October 2022


View original content: https://www.prnewswire.com/news-releases/ibm-redefines-hybrid-cloud-application-and-data-storage-adding-red-hat-storage-to-ibm-offerings-301640078.html

SOURCE IBM

Tue, 04 Oct 2022 | https://finance.yahoo.com/news/ibm-redefines-hybrid-cloud-application-130000004.html
Killexams : Revelio Labs raises cash to scrape the public web for HR insights

Revelio Labs, a startup developing analytics software for HR teams, today announced that it raised $15 million in a Series A round with participation from Elephant Partners, Alumni Ventures, BDMI, K20 Ventures, Techstars and Barclays. The round brings the company's total raised to $19 million, and the proceeds will be put toward expanding Revelio's offerings for corporate HR and strategy, CEO Ben Zweig tells TechCrunch.

Zweig co-launched Revelio with the company’s second co-founder, Yedidya Gorsetman, in early 2018. Prior to Revelio, Zweig was a managing data scientist at IBM in the chief analytics office. Gorsetman came from the film industry, where he produced mostly commercials and feature-length films.

“While at IBM, I had worked on a lot of people analytics projects using internal HR data,” Zweig told TechCrunch in an email interview. “A lot of the work was very interesting, but after three years, I began to notice a frustrating trend. We would have a finding about our workforce — like we were gaining or losing a type of employee — but it was never clear if that was a good thing or a bad thing, because we were only seeing internal data, and we had no way of knowing if the same was true for our competitors.”

Revelio attempts to solve this problem by ingesting and analyzing public employment records to create what Zweig describes as a “universal HR database.” The platform provides metrics on headcounts, as well as inflows and outflows at the occupation, location, seniority, education, gender and ethnicity levels across companies. It also shows trends on job postings, general staff sentiment and layoff notices.

To accomplish this, Revelio scrapes public profiles, resumes and job postings on the web and curates them through the use of in-house algorithms. Zweig claims that Revelio can help answer questions like where talent is being acquired from and why people are joining or leaving a specific company.

“In our taxonomies, we’ve created a mathematical representation of every job title, seniority level, work activity, skill and company. That allows us to automatically adapt to a changing economy, in a much more accurate and simple way,” Zweig explained. “In addition to our job, skill, and activity taxonomies, we’ve also created a company taxonomy mapping to understand [which] companies compete for talent and which compete for products and services.”
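
To illustrate what a "mathematical representation of every job title" might look like in practice, here is a generic sketch that embeds titles as vectors and maps free-text titles onto a small, made-up taxonomy by cosine similarity. This is a common technique shown for illustration only; it is not Revelio's actual pipeline.

```python
# Map scraped free-text job titles onto a canonical taxonomy by
# embedding similarity. The taxonomy below is a made-up example.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

taxonomy = ["Software Engineer", "Data Scientist", "HR Manager", "Accountant"]
taxonomy_vecs = model.encode(taxonomy, convert_to_tensor=True)

def map_title(raw_title: str) -> str:
    """Return the closest canonical title for a raw scraped title."""
    vec = model.encode(raw_title, convert_to_tensor=True)
    scores = util.cos_sim(vec, taxonomy_vecs)[0]
    return taxonomy[int(scores.argmax())]

print(map_title("Sr. Machine Learning Eng, Recommendations"))
```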


Privacy concerns aside, hiring data can be rife with results-skewing errors and omissions — a fact Zweig readily acknowledges. He claims that Revelio’s internal processes mitigate this to the extent possible, but the company’s own documentation reveals the limitations of its measures, like when the platform recently wasn’t reporting people’s highest degrees. Revelio is also subject to the whims and policies of its data sources; last May, Revelio lost access to user skills for the majority of public profiles and was forced to implement an alternative solution.

In another potential knock against Revelio, the company isn't the only one that's canvassed the web for hiring insights. For example, there was Joberate, which looked across social media to attempt to predict the likelihood that particular staff members will leave; Wilson Human Capital acquired the firm in March.

But Zweig argues that Revelio is one of the few vendors to date to create a standard way to understand HR trends across firms and industries — and to make it easy to access.

“The current setup [of legacy companies] requires huge teams of analysts and expensive infrastructures to see even the most basic changes. But at Revelio, we’re automating ways to easily interpret what’s going on and deliver it in an easy to use experience, so that anybody can access this critical intelligence,” he said.

That may be so — but will HR professionals use it? Surveys suggest that they’re generally reluctant when it comes to new software, no matter the application. According to one from PwC in 2020, more than 80% of companies said that they struggled with HR tech adoption challenges, which the authors blamed on planning phases that failed to get the right stakeholders involved.

In response, Zweig points to Revelio’s over 150 customers, ranging from the U.S. Federal Reserve System to banking institutions like Bank of America, Morgan Stanley and Citi, as well as consulting firms KPMG and EY and academic institutions, including MIT and Harvard Business School. Use cases vary, but they’re substantive. In August, Revelio collaborated with KPMG to compare the employment and sentiment impact of SPACs to traditional IPOs.

“The economy right now is dominated by labor market uncertainty and we’re in the unique position to provide a lot of answers to that uncertainty. We couldn’t have known this at the time, but the pandemic ended up accelerating our growth because suddenly as the world was going through major shifts, everyone started tracking this,” Zweig continued, but declined to reveal firm revenue figures. “With trends like The Great Resignation and ‘quiet quitting,’ the private sector is increasingly turning to workforce intelligence platforms like Revelio in order to identify talent issue-areas and pave out potential solutions.”

New York–based Revelio currently has around 50 employees.

Fri, 07 Oct 2022 | https://techcrunch.com/2022/10/06/revelio-labs-raises-cash-to-scrape-the-public-web-for-hr-insights/
Killexams : IBM Expands Partner Access To Training Resources


Wade Tyler Millward

“We can‘t be essential unless our partners are skilled in our products and confident in going to their clients with our products and selling them with us and for IBM,” IBM channel chief Kate Woolley said.


IBM has started giving registered members of its PartnerWorld program access to the training, badges and enablement that IBM sales employees get, along with a new learning hub for accessing the materials.

The expansion is part of the Armonk, N.Y.-based tech giant’s investment in its partner program, IBM channel chief Kate Woolley told CRN in an interview.

“We can‘t be essential unless our partners are skilled in our products and confident in going to their clients with our products and selling them with us and for IBM,” said Woolley (pictured), general manager of the IBM ecosystem.


Partners now have access to sales and technical badges showing industry expertise, according to a blog post Tuesday. Badges are shareable on LinkedIn and other professional social platforms. IBM sales representatives and partners will receive new content at the same time as it becomes available.

“This is the next step in that journey in terms of making sure that all of our registered partners have access to all of the same training, all of the same enablement materials as IBMers,” Woolley told CRN. “That’s the big message that we want people to hear. And then also in line with continuing to make it easier to do business with IBM, this has all been done through a much improved digital experience in terms of how our partners are able to access and consume.”

Among the materials available to IBM partners are scripts for sales demonstrations, templates for sales presentations, guides for positioning offerings against competitors, white papers, analyst reports and solution briefs. Skilling and enablement materials are available through a new learning hub IBM has launched.

“The partners are telling us they want more expertise on their teams in terms of the IBM products that they‘re able to sell and how equipped they are to sell them,” Woolley said. “And as we look at what we’re hearing from clients as well, clients want that. … Our clients are saying, ‘We want more technical expertise. We want more experiential selling. We want IBM’ – and that means the IBM ecosystem as well – ‘to have all of that expertise and to have access to all the right enablement material to be able to engage with us as clients.’”

The company has doubled the number of brand-specialized partner sellers in the ecosystem and increased the number of technical partner sellers by more than 35 percent, according to IBM.

The company’s recent program changes have led to improved deal registration and introduced to partners more than 7,000 potential deals valued at more than $500 million globally, according to IBM. Those numbers are based on IBM sales data from January 2022 to August.

Along with the expanded access to training and enablement resources, Woolley told CRN that another example of aligning the IBM sales force and partners was a single sales kickoff event for employees and partners. A year ago, two separate events were held.

“I want our partners to continue to feel and see this as a big investment in them and representative of how focused we are on the ecosystem and how invested we are,” she said.

Wade Tyler Millward

Wade Tyler Millward is an associate editor covering cloud computing and the channel partner programs of Microsoft, IBM, Red Hat, Oracle, Salesforce, Citrix and other cloud vendors. He can be reached at wmillward@thechannelcompany.com.

Tue, 04 Oct 2022 | https://www.crn.com/news/channel-programs/ibm-expands-partner-access-to-training-resources
Killexams : IBM merges its data storage offerings with Red Hat’s OpenShift and Ceph

IBM Corp. is making some big changes to its data storage services, announcing today that it will bring Red Hat Inc.’s storage products and associates under the “IBM Storage” umbrella.

The aim, IBM said, is to deliver a more consistent application and data storage experience across on-premises and cloud infrastructures. It’s a big move that will see IBM Spectrum Fusion data management software adopt the storage technologies of Red Hat’s OpenShift Data Foundation as its new base layer.

Even more interesting, perhaps, is that the open-source Red Hat Ceph Storage offering will be transformed into a new IBM Ceph storage offering. IBM said this will result in a unified, software-defined storage platform that’s better able to bridge the architectural divide between data centers and cloud computing providers.

The computing giant said the move is in line with its software-defined storage strategy of a “born in the cloud, for the cloud” approach that will unlock bidirectional application and data mobility based on a shared, secure and cloud-scale solution.

IBM Systems General Manager of Storage Denis Kennelly said the shift is designed to streamline the two companies’ portfolios. “By bringing together the teams and integrating our products under one roof, we are accelerating IBM’s hybrid cloud strategy while maintaining commitments to Red Hat’s customers and the open-source community,” he insisted.

The company presented the changes as a big win for customers, saying they will gain access to a more consistent set of storage services that preserve data resilience, security and governance across bare metal, virtualized and containerized environments. More specifically, IBM is promising that customers will have a more unified storage experience for container-based applications running on Red Hat OpenShift, with the ability to use IBM Spectrum Fusion, which is now based on Red Hat OpenShift Data Foundation. Doing so will provide higher performance, greater scale and more automation for OpenShift applications that require block, file and object access to data, the company said.

As for IBM Ceph, the company said this will deliver a more consistent hybrid cloud experience with enterprise-grade scale and resiliency.

Furthermore, by unifying IBM’s and Red Hat’s storage technologies, customers will be able to build a single data lakehouse on IBM Spectrum Scale to aggregate all of their unstructured data in one place. Benefits will include less time spent on maintenance, reduced data movement and redundancy, and more advanced schema management and data governance.
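
As a small sketch of the "direct access to data for analytics tools" idea, the snippet below reads a Parquet dataset in place from a shared file-system mount using pyarrow. The mount path is hypothetical; the point is that analytics engines can query files where they live instead of copying them into a separate warehouse first.

```python
# Query Parquet files in place on a shared scale-out file system mount.
import pyarrow.dataset as ds

lake = ds.dataset("/mnt/scale/lakehouse/events", format="parquet")  # assumed path

table = lake.to_table(
    columns=["user_id", "event_type"],
    filter=ds.field("event_type") == "login",
)
print(table.num_rows)
```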

Industry watchers were united in their belief that the changes would be of benefit to customers. Steve McDowell of Moor Insights & Strategy told SiliconANGLE that today’s move makes a lot of sense because it enables IBM to leverage the storage strengths of both companies.

McDowell explained that although IBM Spectrum is considered to be one of the most comprehensive data management platforms around, its foundation predates the rise of cloud-native technologies. On the other hand, he said, Red Hat OpenShift was built from the ground up to support cloud-native workloads.

“IBM is evolving Spectrum Fusion to take the best of Red Hat’s efforts, and is using Red Hat’s storage software as the base for its IBM-branded products moving forward,” McDowell said. “It makes a lot of business sense for IBM to leverage R&D from Red Hat into its more traditionally proprietary systems. It also gives IBM an easy path to better serve the needs of containerized workloads.”

International Data Corp. analyst Ashish Nadkarni said the two companies are now “speaking with one voice on storage” and finally delivering on the synergies between them that were mentioned when IBM acquired Red Hat in 2019.

“The combining of the two storage teams is a win for IT organizations as it brings together the best that both offer: An industry-leading storage systems portfolio meets an industry-leading software-defined data services offering,” Nadkarni said. “This initiative enables IBM and Red Hat to streamline their family of offerings, passing the benefits to their customers.”

IBM also moved to reassure users of Red Hat’s open-source technologies that it will remain fully committed to them following today’s announcements. As part of the deal, IBM will take over Premier Sponsorship of the Ceph Foundation and, along with Red Hat’s teams, continue to drive innovation and development. Both IBM Ceph and Red Hat OpenShift will remain 100% open-source, the company added, and will continue to follow an upstream-first development model.

McDowell said today’s move would likely make some users nervous about the prospect of Red Hat’s technology becoming more proprietary over time. “IBM has been very careful since it acquired Red Hat in 2019 to keep Red Hat’s open-source business segregated from IBM’s branded offerings,” he said. “This is the first time we’re seeing IBM cross that line, and it’s natural to wonder how blurred those lines will become.”

Still, McDowell said, he’s inclined to believe IBM’s promises as it has been very deliberate about keeping Red Hat’s storage technologies open-source.

“Red Hat OpenShift Data Foundation and Ceph will still be available as they always have, though its evolution will undoubtedly be more strongly guided by the needs of IBM’s storage business,” the analyst continued. “Overall this is a net positive for IBM and its customers. It makes good business sense and there should be minimal impact to Red Hat’s existing community.”

IBM said the first storage solutions to launch under the new IBM Ceph Storage and IBM Spectrum Fusion banners will arrive in the first half of 2023, so users will have plenty of time to digest the changes.



Wed, 05 Oct 2022 | https://siliconangle.com/2022/10/04/ibm-merges-data-storage-offerings-red-hats-openshift-ceph/